Oct 03 06:56:05 crc systemd[1]: Starting Kubernetes Kubelet... Oct 03 06:56:05 crc restorecon[4805]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 03 06:56:05 
crc restorecon[4805]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 
06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc 
restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 
crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 
crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:05 crc restorecon[4805]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 
06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 06:56:06 crc 
restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 
06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc 
restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 06:56:06 crc restorecon[4805]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 06:56:07 crc kubenswrapper[4810]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.023252 4810 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030197 4810 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030232 4810 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030242 4810 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030253 4810 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030265 4810 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030276 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030285 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030297 4810 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030309 4810 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030318 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030327 4810 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030349 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030358 4810 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030366 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030375 4810 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030384 4810 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030393 4810 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030401 4810 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030410 4810 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030418 4810 feature_gate.go:330] unrecognized feature gate: Example Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030426 4810 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030436 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030446 4810 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030456 4810 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030468 4810 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030480 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030491 4810 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030500 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030509 4810 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030519 4810 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030528 4810 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030538 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030547 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030555 4810 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030564 4810 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030573 4810 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030582 4810 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030591 4810 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030600 4810 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030611 4810 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030620 4810 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030630 4810 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030640 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030649 4810 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030660 4810 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030669 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030678 4810 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030686 4810 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030695 4810 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030704 4810 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030712 4810 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030720 4810 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030729 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030737 4810 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030746 4810 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030754 4810 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030766 4810 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030775 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030783 4810 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030793 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030801 4810 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030812 4810 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
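The flags.go:64 "FLAG:" dump that begins a few entries below is the kubelet echoing every command-line value before merging it with the config file. Rather than reconstructing the effective settings from that dump, the merged runtime configuration of the running kubelet can usually be read back through the API server's node proxy; a sketch assuming cluster access with permission to use the node proxy, where the node name crc is taken from the log and jq is only used for pretty-printing:

# Fetch the kubelet's merged runtime configuration from its configz endpoint
oc get --raw "/api/v1/nodes/crc/proxy/configz" | jq .kubeletconfig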
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030822 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030832 4810 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030841 4810 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030850 4810 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030858 4810 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030868 4810 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030877 4810 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030886 4810 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.030920 4810 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031060 4810 flags.go:64] FLAG: --address="0.0.0.0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031077 4810 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031093 4810 flags.go:64] FLAG: --anonymous-auth="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031105 4810 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031119 4810 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031129 4810 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031141 4810 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031153 4810 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031163 4810 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031173 4810 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031183 4810 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031193 4810 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031204 4810 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031214 4810 flags.go:64] FLAG: --cgroup-root="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031223 4810 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031233 4810 flags.go:64] FLAG: --client-ca-file="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031242 4810 flags.go:64] FLAG: --cloud-config="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031252 4810 flags.go:64] FLAG: --cloud-provider="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031261 4810 flags.go:64] FLAG: --cluster-dns="[]" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031273 
4810 flags.go:64] FLAG: --cluster-domain="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031283 4810 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031293 4810 flags.go:64] FLAG: --config-dir="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031303 4810 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031314 4810 flags.go:64] FLAG: --container-log-max-files="5" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031325 4810 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031335 4810 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031367 4810 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031378 4810 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031388 4810 flags.go:64] FLAG: --contention-profiling="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031398 4810 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031408 4810 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031418 4810 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031427 4810 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031439 4810 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031449 4810 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031459 4810 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031469 4810 flags.go:64] FLAG: --enable-load-reader="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031478 4810 flags.go:64] FLAG: --enable-server="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031488 4810 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031499 4810 flags.go:64] FLAG: --event-burst="100" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031510 4810 flags.go:64] FLAG: --event-qps="50" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031520 4810 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031530 4810 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031539 4810 flags.go:64] FLAG: --eviction-hard="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031551 4810 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031561 4810 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031571 4810 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031581 4810 flags.go:64] FLAG: --eviction-soft="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031590 4810 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 03 06:56:07 crc 
kubenswrapper[4810]: I1003 06:56:07.031601 4810 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031611 4810 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031620 4810 flags.go:64] FLAG: --experimental-mounter-path="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031630 4810 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031639 4810 flags.go:64] FLAG: --fail-swap-on="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031649 4810 flags.go:64] FLAG: --feature-gates="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031660 4810 flags.go:64] FLAG: --file-check-frequency="20s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031670 4810 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031681 4810 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031692 4810 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031702 4810 flags.go:64] FLAG: --healthz-port="10248" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031713 4810 flags.go:64] FLAG: --help="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031723 4810 flags.go:64] FLAG: --hostname-override="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031732 4810 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031744 4810 flags.go:64] FLAG: --http-check-frequency="20s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031756 4810 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031768 4810 flags.go:64] FLAG: --image-credential-provider-config="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031780 4810 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031793 4810 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031806 4810 flags.go:64] FLAG: --image-service-endpoint="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031818 4810 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031831 4810 flags.go:64] FLAG: --kube-api-burst="100" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031843 4810 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031858 4810 flags.go:64] FLAG: --kube-api-qps="50" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031871 4810 flags.go:64] FLAG: --kube-reserved="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031882 4810 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031932 4810 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031948 4810 flags.go:64] FLAG: --kubelet-cgroups="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031958 4810 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031968 4810 flags.go:64] FLAG: --lock-file="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 
06:56:07.031978 4810 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.031990 4810 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032003 4810 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032022 4810 flags.go:64] FLAG: --log-json-split-stream="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032032 4810 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032046 4810 flags.go:64] FLAG: --log-text-split-stream="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032059 4810 flags.go:64] FLAG: --logging-format="text" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032071 4810 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032082 4810 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032092 4810 flags.go:64] FLAG: --manifest-url="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032101 4810 flags.go:64] FLAG: --manifest-url-header="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032114 4810 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032124 4810 flags.go:64] FLAG: --max-open-files="1000000" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032136 4810 flags.go:64] FLAG: --max-pods="110" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032146 4810 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032155 4810 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032165 4810 flags.go:64] FLAG: --memory-manager-policy="None" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032175 4810 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032185 4810 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032194 4810 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032204 4810 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032224 4810 flags.go:64] FLAG: --node-status-max-images="50" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032234 4810 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032244 4810 flags.go:64] FLAG: --oom-score-adj="-999" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032253 4810 flags.go:64] FLAG: --pod-cidr="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032263 4810 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032281 4810 flags.go:64] FLAG: --pod-manifest-path="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032293 4810 flags.go:64] FLAG: --pod-max-pids="-1" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032306 4810 flags.go:64] FLAG: --pods-per-core="0" Oct 03 
06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032317 4810 flags.go:64] FLAG: --port="10250" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032328 4810 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032340 4810 flags.go:64] FLAG: --provider-id="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032353 4810 flags.go:64] FLAG: --qos-reserved="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032369 4810 flags.go:64] FLAG: --read-only-port="10255" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032382 4810 flags.go:64] FLAG: --register-node="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032397 4810 flags.go:64] FLAG: --register-schedulable="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032409 4810 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032430 4810 flags.go:64] FLAG: --registry-burst="10" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032441 4810 flags.go:64] FLAG: --registry-qps="5" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032451 4810 flags.go:64] FLAG: --reserved-cpus="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032462 4810 flags.go:64] FLAG: --reserved-memory="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032477 4810 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032490 4810 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032501 4810 flags.go:64] FLAG: --rotate-certificates="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032511 4810 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032522 4810 flags.go:64] FLAG: --runonce="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032533 4810 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032544 4810 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032555 4810 flags.go:64] FLAG: --seccomp-default="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032565 4810 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032574 4810 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032585 4810 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032594 4810 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032604 4810 flags.go:64] FLAG: --storage-driver-password="root" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032614 4810 flags.go:64] FLAG: --storage-driver-secure="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032623 4810 flags.go:64] FLAG: --storage-driver-table="stats" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032633 4810 flags.go:64] FLAG: --storage-driver-user="root" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032655 4810 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032666 4810 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 
06:56:07.032676 4810 flags.go:64] FLAG: --system-cgroups="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032686 4810 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032701 4810 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032710 4810 flags.go:64] FLAG: --tls-cert-file="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032720 4810 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032740 4810 flags.go:64] FLAG: --tls-min-version="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032749 4810 flags.go:64] FLAG: --tls-private-key-file="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032759 4810 flags.go:64] FLAG: --topology-manager-policy="none" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032769 4810 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032779 4810 flags.go:64] FLAG: --topology-manager-scope="container" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032789 4810 flags.go:64] FLAG: --v="2" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032802 4810 flags.go:64] FLAG: --version="false" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032814 4810 flags.go:64] FLAG: --vmodule="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032825 4810 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.032835 4810 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033081 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033095 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033110 4810 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033143 4810 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033155 4810 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033167 4810 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033179 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033190 4810 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033201 4810 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033210 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033219 4810 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033227 4810 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033235 4810 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033243 4810 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033252 4810 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033261 4810 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033269 4810 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033280 4810 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033291 4810 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033302 4810 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033311 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033321 4810 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033333 4810 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033344 4810 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033355 4810 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033364 4810 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033374 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033383 4810 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033392 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033400 4810 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033410 4810 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033427 4810 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033435 4810 feature_gate.go:330] unrecognized feature gate: Example Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033445 4810 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033453 4810 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033461 4810 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033470 4810 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033479 4810 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033487 4810 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033495 4810 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033503 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033512 4810 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033520 4810 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033529 4810 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033537 4810 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033545 4810 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033553 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033565 4810 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033575 4810 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033586 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033596 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033605 4810 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033614 4810 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033624 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033633 4810 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033642 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033650 4810 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033659 4810 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033668 4810 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033676 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033685 4810 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033693 4810 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033702 4810 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033710 4810 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 06:56:07 crc kubenswrapper[4810]: 
W1003 06:56:07.033725 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033733 4810 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033744 4810 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033755 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033764 4810 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033773 4810 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.033781 4810 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.034779 4810 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.048326 4810 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.048395 4810 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048554 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048583 4810 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048593 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048604 4810 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048614 4810 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048623 4810 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048631 4810 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048639 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048647 4810 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048655 4810 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048663 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048670 4810 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048678 4810 feature_gate.go:330] unrecognized 
feature gate: OVNObservability Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048687 4810 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048695 4810 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048703 4810 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048711 4810 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048721 4810 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048731 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048741 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048751 4810 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048762 4810 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048776 4810 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048785 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048796 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048835 4810 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048845 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048854 4810 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048863 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048919 4810 feature_gate.go:330] unrecognized feature gate: Example Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048927 4810 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048935 4810 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048943 4810 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048951 4810 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048962 4810 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048970 4810 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048978 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048986 4810 feature_gate.go:330] unrecognized feature gate: 
AdditionalRoutingCapabilities Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.048993 4810 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049001 4810 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049009 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049017 4810 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049027 4810 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049037 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049046 4810 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049054 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049062 4810 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049070 4810 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049081 4810 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049092 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049101 4810 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049111 4810 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049122 4810 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
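The long runs of "unrecognized feature gate" warnings above and below come from the kubelet being handed the full OpenShift feature-gate list; names such as GatewayAPI, MachineConfigNodes or PinnedImages appear to be OpenShift-level gates with no upstream kubelet counterpart, so the kubelet logs them and moves on, and only the recognized upstream gates show up in the feature_gate.go:386 summaries. To see which gates the cluster itself enables or disables, the cluster-scoped FeatureGate object can be inspected; a sketch assuming cluster-admin access with oc:

# The cluster-scoped FeatureGate object is always named "cluster"
oc get featuregate cluster -o yaml

# Current feature set (empty for Default, or e.g. TechPreviewNoUpgrade)
oc get featuregate cluster -o jsonpath='{.spec.featureSet}{"\n"}'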
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049132 4810 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049141 4810 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049150 4810 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049158 4810 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049166 4810 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049175 4810 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049184 4810 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049192 4810 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049201 4810 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049209 4810 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049217 4810 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049226 4810 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049234 4810 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049243 4810 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049251 4810 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049260 4810 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049270 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049285 4810 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.049315 4810 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049578 4810 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049595 4810 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049605 4810 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049614 4810 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 06:56:07 
crc kubenswrapper[4810]: W1003 06:56:07.049622 4810 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049629 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049637 4810 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049644 4810 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049653 4810 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049661 4810 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049670 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049680 4810 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049692 4810 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049702 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049710 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049718 4810 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049726 4810 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049733 4810 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049742 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049750 4810 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049758 4810 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049766 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049773 4810 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049781 4810 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049789 4810 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049797 4810 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049805 4810 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049812 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049820 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049831 4810 feature_gate.go:330] unrecognized 
feature gate: BareMetalLoadBalancer Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049838 4810 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049846 4810 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049855 4810 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049864 4810 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049873 4810 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049882 4810 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049922 4810 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049931 4810 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049939 4810 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049947 4810 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049954 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049963 4810 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049971 4810 feature_gate.go:330] unrecognized feature gate: Example Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049978 4810 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049987 4810 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.049994 4810 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050002 4810 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050013 4810 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050022 4810 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050031 4810 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050039 4810 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050049 4810 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050058 4810 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050066 4810 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050074 4810 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050082 4810 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050090 4810 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050097 4810 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050105 4810 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050115 4810 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050125 4810 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050134 4810 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050142 4810 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050150 4810 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050159 4810 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050167 4810 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050175 4810 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050183 4810 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050193 4810 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
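Below, the kubelet reports that client certificate rotation is on, that the current kubeconfig is still valid, and that it will wait roughly 2256h52m before the next rotation, a deadline derived from the certificate expiry of 2026-02-24 05:52:08 UTC. The same expiry can be confirmed directly on the node from the certificate file named in the log; a sketch assuming openssl is available on the host:

# Print the notAfter date of the kubelet's current client certificate
openssl x509 -noout -enddate -in /var/lib/kubelet/pki/kubelet-client-current.pem

# Or the full validity window plus the subject the kubelet authenticates as
openssl x509 -noout -dates -subject -in /var/lib/kubelet/pki/kubelet-client-current.pem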
Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050202 4810 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.050219 4810 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.050232 4810 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.051474 4810 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.060618 4810 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.060796 4810 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.062558 4810 server.go:997] "Starting client certificate rotation" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.062603 4810 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.062815 4810 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-05 07:48:53.877508683 +0000 UTC Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.062961 4810 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2256h52m46.814553583s for next certificate rotation Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.095955 4810 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.101528 4810 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.133749 4810 log.go:25] "Validated CRI v1 runtime API" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.177907 4810 log.go:25] "Validated CRI v1 image API" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.180652 4810 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.186366 4810 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-06-43-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.186446 4810 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp 
major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.216835 4810 manager.go:217] Machine: {Timestamp:2025-10-03 06:56:07.213395592 +0000 UTC m=+0.640646387 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:54332209-85c4-4e31-bef8-717ad4ff0760 BootID:cf31509f-a3d9-4881-a83a-a21fceb0f392 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ce:b9:85 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ce:b9:85 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ff:b7:45 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:74:55:61 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:8a:e3:31 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ef:9f:d3 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:40:f1:72 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:ae:5b:0d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:92:b5:0a:ec:0f:ad Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ba:33:5b:4a:54:72 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 
Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.217281 4810 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
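The certificate_manager.go entries a little earlier ("Certificate expiration is 2026-02-24 05:52:08 ... rotation deadline is 2026-01-05 ... Waiting 2256h52m46s") boil down to simple date arithmetic: a rotation deadline is picked at a jittered point inside the certificate's validity window, and the kubelet sleeps for deadline minus now. The sketch below reproduces that arithmetic in plain Go. The 70-90% jitter range, the one-year lifetime (the log does not show the certificate's notBefore) and the rotationDeadline helper are assumptions for illustration, not the upstream constants.

// certrotation_sketch.go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a deadline between 70% and 90% of the way
// through the certificate's validity window (assumed range).
func rotationDeadline(notBefore, notAfter time.Time, r *rand.Rand) time.Time {
	total := notAfter.Sub(notBefore)
	fraction := 0.7 + 0.2*r.Float64()
	return notBefore.Add(time.Duration(float64(total) * fraction))
}

func main() {
	// Expiry taken from the kubelet-client certificate logged above.
	notAfter := time.Date(2026, 2, 24, 5, 52, 8, 0, time.UTC)
	// Issue time is not in the log; assume a one-year lifetime for the sketch.
	notBefore := notAfter.AddDate(-1, 0, 0)

	r := rand.New(rand.NewSource(time.Now().UnixNano()))
	deadline := rotationDeadline(notBefore, notAfter, r)

	// Timestamp of the log lines above.
	now := time.Date(2025, 10, 3, 6, 56, 7, 0, time.UTC)
	fmt.Printf("rotation deadline: %s\n", deadline)
	fmt.Printf("waiting %s for next certificate rotation\n", deadline.Sub(now))
}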
Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.217550 4810 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.219141 4810 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.219505 4810 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.219569 4810 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.220125 4810 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.220149 4810 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.220737 4810 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.220801 4810 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.221206 4810 state_mem.go:36] "Initialized new in-memory state store" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.221385 4810 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.225313 4810 kubelet.go:418] "Attempting to sync node with API server" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.225353 4810 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
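The container_manager_linux.go node config above ("SystemReserved":{"cpu":"200m","memory":"350Mi","ephemeral-storage":"350Mi"}, KubeReserved null, memory.available hard-eviction threshold 100Mi) feeds the usual Node Allocatable calculation: allocatable = capacity - kube-reserved - system-reserved - hard-eviction. The Go sketch below works that formula through with the memory capacity and core count from the Machine line; the constant names are illustrative, only memory and CPU are covered (CPU has no eviction signal, so only the reservation is subtracted), and this reproduces the general formula rather than the kubelet's exact code.

// allocatable_sketch.go
package main

import "fmt"

const (
	mi = int64(1024 * 1024)

	memCapacityBytes   = int64(33654128640) // MemoryCapacity from the Machine line
	systemReservedMem  = 350 * mi           // "SystemReserved":{"memory":"350Mi"}
	evictionMemoryHard = 100 * mi           // memory.available hard eviction threshold

	cpuCapacityMilli  = int64(12 * 1000) // NumCores:12
	systemReservedCPU = int64(200)       // "SystemReserved":{"cpu":"200m"}
)

func main() {
	// KubeReserved is null in the config above, so it contributes 0 here.
	allocMem := memCapacityBytes - systemReservedMem - evictionMemoryHard
	allocCPU := cpuCapacityMilli - systemReservedCPU

	fmt.Printf("allocatable memory: %d bytes (~%.1f GiB)\n",
		allocMem, float64(allocMem)/float64(1024*1024*1024))
	fmt.Printf("allocatable cpu:    %dm\n", allocCPU)
}

With these inputs the sketch reports roughly 30.9 GiB of allocatable memory and 11800m of allocatable CPU for the node.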
Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.225385 4810 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.225410 4810 kubelet.go:324] "Adding apiserver pod source" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.225475 4810 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.229769 4810 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.231198 4810 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.231301 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.231452 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.231584 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.231729 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.233935 4810 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235719 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235751 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235761 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235771 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235793 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235803 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235813 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235828 4810 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235840 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235851 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235878 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.235887 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.237023 4810 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.237582 4810 server.go:1280] "Started kubelet" Oct 03 06:56:07 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.242511 4810 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.242523 4810 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.243779 4810 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.244028 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250417 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250454 4810 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250544 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 02:25:12.220727459 +0000 UTC Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250607 4810 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1507h29m4.970124702s for next certificate rotation Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.250851 4810 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250950 4810 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250971 4810 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.250971 4810 server.go:460] "Adding debug handlers to kubelet server" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.251149 4810 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.251886 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.252009 4810 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252256 4810 factory.go:55] Registering systemd factory Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252288 4810 factory.go:221] Registration of the systemd container factory successfully Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252688 4810 factory.go:153] Registering CRI-O factory Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252721 4810 factory.go:221] Registration of the crio container factory successfully Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252813 4810 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.252831 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="200ms" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252864 4810 factory.go:103] Registering Raw factory Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.252913 4810 manager.go:1196] Started watching for new ooms in manager Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.253652 4810 manager.go:319] Starting recovery of all containers Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.256161 4810 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.246:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186ae8cbfb8730f8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 06:56:07.237546232 +0000 UTC m=+0.664796977,LastTimestamp:2025-10-03 06:56:07.237546232 +0000 UTC m=+0.664796977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263743 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263803 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263823 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263845 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263862 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263880 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263908 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263923 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263945 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263959 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263977 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.263992 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264020 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264041 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264061 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264080 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264094 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264113 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264127 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264145 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264162 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264176 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264216 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264233 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264256 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264270 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264293 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264309 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264328 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264347 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264365 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264378 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.264392 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267689 4810 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267772 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267791 4810 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267802 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267814 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267824 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267835 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267846 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267859 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267871 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267880 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267903 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267915 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267926 4810 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267936 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267946 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267956 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267967 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267977 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.267986 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268004 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268017 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268030 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268658 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268679 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268694 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268709 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268722 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268736 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268787 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268800 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268810 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268820 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268831 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268842 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268852 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268863 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268874 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268906 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268918 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268928 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268938 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268948 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268958 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268968 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268978 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268988 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.268997 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269007 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269016 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269026 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269035 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269045 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269056 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269067 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269077 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269088 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269097 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269106 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269116 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269126 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269139 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269150 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269160 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269171 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269181 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269191 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269201 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269214 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269224 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269235 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269246 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269263 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269276 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269288 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269299 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269311 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269322 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269333 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269345 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269356 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269367 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269377 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269388 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269398 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269408 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269420 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269430 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269441 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269452 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269462 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269472 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269482 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269491 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269501 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269510 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269520 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269529 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269538 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269546 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269554 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269563 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269572 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269580 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269589 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269598 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269606 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269615 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269625 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269636 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269645 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269653 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269662 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269670 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269679 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269688 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269697 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269706 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269716 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269727 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269740 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269753 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269763 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269774 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269784 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269794 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269805 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269814 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269824 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269834 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269843 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269852 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269861 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269871 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269881 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269906 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269917 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269929 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269938 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269947 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269957 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269967 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269977 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269988 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.269997 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270005 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270015 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270025 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270034 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270043 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270053 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270062 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270075 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270085 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270094 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270105 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270113 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270122 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270130 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270139 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270148 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270157 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270166 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270175 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270184 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270192 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270202 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270211 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270219 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270229 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270237 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270247 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270256 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270266 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270275 4810 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270285 4810 reconstruct.go:97] "Volume reconstruction finished" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.270293 4810 reconciler.go:26] "Reconciler: start to sync state" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.279645 4810 manager.go:324] Recovery completed Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.296208 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.297831 4810 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.298364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.298460 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.298564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.299827 4810 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.299938 4810 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.300020 4810 state_mem.go:36] "Initialized new in-memory state store" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.301070 4810 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.301148 4810 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.301187 4810 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.301366 4810 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.302098 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.302169 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.321027 4810 policy_none.go:49] "None policy: Start" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.322304 4810 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.322349 4810 state_mem.go:35] "Initializing new in-memory state store" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.351491 4810 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.384730 4810 manager.go:334] "Starting Device Plugin manager" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.385007 4810 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.385038 4810 server.go:79] "Starting device plugin registration server" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.385588 4810 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.385640 4810 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 
06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.387062 4810 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.387257 4810 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.387273 4810 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.396086 4810 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.402353 4810 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.402438 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.403936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.403970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.403982 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.404125 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.404481 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.404524 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405280 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405311 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405394 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405400 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.405417 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406342 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406405 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406647 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406682 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.406700 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409232 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409263 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409427 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409455 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409705 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409756 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.409797 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411248 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411600 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411652 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.411674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.412000 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.412071 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.416164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.416219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.416238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.454416 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="400ms" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473646 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473745 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473764 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473803 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473824 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473844 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473919 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473940 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.473965 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.474012 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.474033 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.474048 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.474061 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.486046 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.487983 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.488049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.488068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.488105 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.488664 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575729 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575875 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576046 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576332 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576469 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576607 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576747 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576879 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.577052 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.577144 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575772 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: 
I1003 06:56:07.576521 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576175 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576649 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576365 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576784 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.576941 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575935 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.575571 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.577815 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.577945 
4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.578058 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.577982 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.578296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.578412 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.578422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.578573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.689759 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.691819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.691880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.691940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.691986 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.692632 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" 
Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.734222 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.751051 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.760636 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.789322 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: I1003 06:56:07.796530 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.799825 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a0db911c9bbc471e35fd412af37ecd499a0791777771b315d6aa3ea097b5b4fd WatchSource:0}: Error finding container a0db911c9bbc471e35fd412af37ecd499a0791777771b315d6aa3ea097b5b4fd: Status 404 returned error can't find the container with id a0db911c9bbc471e35fd412af37ecd499a0791777771b315d6aa3ea097b5b4fd Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.802793 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8a3fd804aa6bd21e555fd59d7164e80cb007e344734421b02aaf31e9f39df07a WatchSource:0}: Error finding container 8a3fd804aa6bd21e555fd59d7164e80cb007e344734421b02aaf31e9f39df07a: Status 404 returned error can't find the container with id 8a3fd804aa6bd21e555fd59d7164e80cb007e344734421b02aaf31e9f39df07a Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.820094 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d4fe19f9c5b7c51891bf597908e0f1c68b3ea290a106b773c94573197c2c4f33 WatchSource:0}: Error finding container d4fe19f9c5b7c51891bf597908e0f1c68b3ea290a106b773c94573197c2c4f33: Status 404 returned error can't find the container with id d4fe19f9c5b7c51891bf597908e0f1c68b3ea290a106b773c94573197c2c4f33 Oct 03 06:56:07 crc kubenswrapper[4810]: W1003 06:56:07.821288 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-12da054c57002b04ba80d17eef8cba2ca2acbc422e226cbe59dc9dcc68461eb5 WatchSource:0}: Error finding container 12da054c57002b04ba80d17eef8cba2ca2acbc422e226cbe59dc9dcc68461eb5: Status 404 returned error can't find the container with id 12da054c57002b04ba80d17eef8cba2ca2acbc422e226cbe59dc9dcc68461eb5 Oct 03 06:56:07 crc kubenswrapper[4810]: E1003 06:56:07.856184 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="800ms" Oct 03 06:56:08 crc kubenswrapper[4810]: W1003 06:56:08.073149 4810 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.073254 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.093262 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.095580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.095642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.095660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.095699 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.096205 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.245516 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:08 crc kubenswrapper[4810]: W1003 06:56:08.273478 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.273568 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.305708 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"12da054c57002b04ba80d17eef8cba2ca2acbc422e226cbe59dc9dcc68461eb5"} Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.307373 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4fe19f9c5b7c51891bf597908e0f1c68b3ea290a106b773c94573197c2c4f33"} Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.308699 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4c978964bcb1147384782562c2785010ecae764822af9b9d005fee860b92f730"} Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.310071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8a3fd804aa6bd21e555fd59d7164e80cb007e344734421b02aaf31e9f39df07a"} Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.311538 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a0db911c9bbc471e35fd412af37ecd499a0791777771b315d6aa3ea097b5b4fd"} Oct 03 06:56:08 crc kubenswrapper[4810]: W1003 06:56:08.631739 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.632466 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.657917 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="1.6s" Oct 03 06:56:08 crc kubenswrapper[4810]: W1003 06:56:08.758124 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.758238 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.897296 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.911238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.911331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.911357 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:08 crc kubenswrapper[4810]: I1003 06:56:08.911411 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:08 crc kubenswrapper[4810]: E1003 06:56:08.912448 4810 kubelet_node_status.go:99] "Unable to register 
node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.245315 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.318748 4810 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149" exitCode=0 Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.318831 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.318971 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.320289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.320331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.320341 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.322435 4810 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094" exitCode=0 Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.322518 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.322575 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.324375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.324431 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.324449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.324574 4810 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02" exitCode=0 Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.324697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02"} Oct 03 06:56:09 crc 
kubenswrapper[4810]: I1003 06:56:09.324713 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.328536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.328569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.328581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.329539 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167" exitCode=0 Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.329613 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.329639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.330596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.330630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.330642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335570 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335603 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335618 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335632 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d"} Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335733 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.335980 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.336872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.336952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.336966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.337110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.337150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:09 crc kubenswrapper[4810]: I1003 06:56:09.337163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:09 crc kubenswrapper[4810]: W1003 06:56:09.885046 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:09 crc kubenswrapper[4810]: E1003 06:56:09.885135 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.245778 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:10 crc kubenswrapper[4810]: E1003 06:56:10.259340 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="3.2s" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.347197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.347269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.347289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.347310 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.349358 4810 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1" exitCode=0 Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.349418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.349562 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.350607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.350745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.350832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.354547 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.354554 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c6ee80464125edb57275df5080d958b19a4423fab16e5e62e4be550c84a514a4"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.355625 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.355682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.355714 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.361668 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.361737 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.361747 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.361750 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f"} Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.361694 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.363335 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.513598 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.514728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.514754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.514763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:10 crc kubenswrapper[4810]: I1003 06:56:10.514782 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:10 crc kubenswrapper[4810]: E1003 06:56:10.515141 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 03 06:56:10 crc kubenswrapper[4810]: W1003 06:56:10.639637 4810 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 03 06:56:10 crc kubenswrapper[4810]: E1003 06:56:10.639721 4810 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.278141 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.365311 4810 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec" exitCode=0 Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.365399 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.365386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec"} Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.366211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.366235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.366242 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372293 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372315 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372335 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372424 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1"} Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.372435 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.373959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374035 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374676 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.374695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.375127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.375178 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.375207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:11 crc kubenswrapper[4810]: I1003 06:56:11.537229 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382385 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283"} Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a"} Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31"} Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382558 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382631 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382562 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4"} Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.382411 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384280 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.384650 4810 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:12 crc kubenswrapper[4810]: I1003 06:56:12.613187 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.391489 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.391573 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.391483 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c"} Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.391637 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.393310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.393362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.393421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.393444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.393375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.394044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.716135 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.718062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.718134 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.718153 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:13 crc kubenswrapper[4810]: I1003 06:56:13.718185 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.394082 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.395181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.395230 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.395247 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 
06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.444867 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.445217 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.447051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.447148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.447167 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.452128 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.537813 4810 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.538014 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.749761 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.750127 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.752067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.752123 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.752143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.883439 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.948703 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.949661 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.951818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.951965 4810 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:14 crc kubenswrapper[4810]: I1003 06:56:14.951997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.397741 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.397820 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.399465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.399520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.399540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.400172 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.400235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:15 crc kubenswrapper[4810]: I1003 06:56:15.400254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:16 crc kubenswrapper[4810]: I1003 06:56:16.573826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:16 crc kubenswrapper[4810]: I1003 06:56:16.574131 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:16 crc kubenswrapper[4810]: I1003 06:56:16.575487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:16 crc kubenswrapper[4810]: I1003 06:56:16.575525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:16 crc kubenswrapper[4810]: I1003 06:56:16.575537 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:17 crc kubenswrapper[4810]: E1003 06:56:17.396998 4810 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 06:56:18 crc kubenswrapper[4810]: I1003 06:56:18.359790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 06:56:18 crc kubenswrapper[4810]: I1003 06:56:18.360117 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:18 crc kubenswrapper[4810]: I1003 06:56:18.365000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:18 crc kubenswrapper[4810]: I1003 06:56:18.365067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:18 crc kubenswrapper[4810]: I1003 06:56:18.365086 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 06:56:20 crc kubenswrapper[4810]: I1003 06:56:20.419444 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 06:56:20 crc kubenswrapper[4810]: I1003 06:56:20.419648 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:20 crc kubenswrapper[4810]: I1003 06:56:20.421007 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:20 crc kubenswrapper[4810]: I1003 06:56:20.421086 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:20 crc kubenswrapper[4810]: I1003 06:56:20.421112 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:21 crc kubenswrapper[4810]: I1003 06:56:21.246220 4810 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 03 06:56:21 crc kubenswrapper[4810]: I1003 06:56:21.363866 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 06:56:21 crc kubenswrapper[4810]: I1003 06:56:21.363991 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 06:56:21 crc kubenswrapper[4810]: I1003 06:56:21.380932 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 06:56:21 crc kubenswrapper[4810]: I1003 06:56:21.381005 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.619690 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.619860 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.620985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.621049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.621066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 03 06:56:22 crc kubenswrapper[4810]: I1003 06:56:22.624169 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:23 crc kubenswrapper[4810]: I1003 06:56:23.419643 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:23 crc kubenswrapper[4810]: I1003 06:56:23.421182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:23 crc kubenswrapper[4810]: I1003 06:56:23.421250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:23 crc kubenswrapper[4810]: I1003 06:56:23.421274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:24 crc kubenswrapper[4810]: I1003 06:56:24.537453 4810 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 06:56:24 crc kubenswrapper[4810]: I1003 06:56:24.537576 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 06:56:26 crc kubenswrapper[4810]: E1003 06:56:26.340219 4810 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.342187 4810 trace.go:236] Trace[795365630]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 06:56:11.618) (total time: 14723ms): Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[795365630]: ---"Objects listed" error: 14723ms (06:56:26.342) Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[795365630]: [14.723671192s] [14.723671192s] END Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.342217 4810 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.343153 4810 trace.go:236] Trace[308867691]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 06:56:14.466) (total time: 11876ms): Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[308867691]: ---"Objects listed" error: 11876ms (06:56:26.343) Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[308867691]: [11.876137841s] [11.876137841s] END Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.343188 4810 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.343323 4810 trace.go:236] Trace[248813706]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 06:56:11.646) (total time: 14696ms): Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[248813706]: ---"Objects 
listed" error: 14696ms (06:56:26.343) Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[248813706]: [14.696980942s] [14.696980942s] END Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.343351 4810 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 06:56:26 crc kubenswrapper[4810]: E1003 06:56:26.344126 4810 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.344921 4810 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.345093 4810 trace.go:236] Trace[171249312]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 06:56:16.140) (total time: 10204ms): Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[171249312]: ---"Objects listed" error: 10204ms (06:56:26.344) Oct 03 06:56:26 crc kubenswrapper[4810]: Trace[171249312]: [10.204336741s] [10.204336741s] END Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.345116 4810 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387110 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42236->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387175 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42236->192.168.126.11:17697: read: connection reset by peer" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387121 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42242->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387311 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42242->192.168.126.11:17697: read: connection reset by peer" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387519 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.387691 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.388102 4810 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.388136 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.428665 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.430785 4810 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1" exitCode=255 Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.430843 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1"} Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.445114 4810 scope.go:117] "RemoveContainer" containerID="f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1" Oct 03 06:56:26 crc kubenswrapper[4810]: I1003 06:56:26.578177 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.238818 4810 apiserver.go:52] "Watching apiserver" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.241111 4810 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.241576 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d69n4","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.242224 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.242255 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.242316 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.242334 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.242395 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.242619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.242616 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.243105 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.243148 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.243205 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.244867 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.244910 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.244958 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.244966 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.244962 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.245206 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.245342 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.245395 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.245535 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.245610 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.246665 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.248349 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.251705 4810 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.261312 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.275398 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.288571 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03
T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.299694 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.327133 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.348607 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351183 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351246 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351272 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351302 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351327 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351353 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351375 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.351402 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351420 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351440 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351460 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351478 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351495 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351551 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351570 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351588 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351606 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351627 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351646 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351700 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351725 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351746 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351775 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351798 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351818 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351835 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351854 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351914 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351952 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351976 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351996 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352025 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352094 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352472 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352491 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352507 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352525 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352565 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352587 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352603 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352673 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352705 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352723 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352742 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352758 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352778 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352801 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352816 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352848 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352869 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352910 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352930 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352948 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352968 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352986 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353008 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353045 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353098 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353118 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 
06:56:27.353137 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353156 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353193 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353211 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353227 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353247 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353264 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353282 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353298 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353318 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353350 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353368 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353385 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353404 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353422 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353441 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353461 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.353515 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353532 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353551 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353583 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353601 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353617 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353637 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353688 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353704 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353723 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353742 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353758 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353773 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353790 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353807 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353822 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353838 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.353853 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353869 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353885 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353912 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353928 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353944 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353961 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353977 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353994 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354011 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 
06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354027 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354044 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354061 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354079 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354096 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354112 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354129 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354148 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354165 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354184 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354200 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354217 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354233 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354253 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354273 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354291 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354308 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354325 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354342 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354360 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354427 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354451 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354472 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354494 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354516 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354539 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354584 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354606 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354627 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354780 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354810 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354836 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354862 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354907 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354935 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354984 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355007 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355033 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355056 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355081 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355104 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355153 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355201 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355229 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355256 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.355280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355307 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355361 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355388 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355417 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355445 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355468 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355493 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355523 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355548 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355573 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355607 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355681 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355709 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355735 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355760 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355783 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355837 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355863 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355911 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355938 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355969 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355994 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356019 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356069 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356093 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356155 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356191 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356219 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356248 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356273 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxq2d\" (UniqueName: \"kubernetes.io/projected/4d67ab19-ac19-4673-ade0-a35ffb299e85-kube-api-access-zxq2d\") pod \"node-resolver-d69n4\" (UID: \"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356330 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356412 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d67ab19-ac19-4673-ade0-a35ffb299e85-hosts-file\") pod \"node-resolver-d69n4\" (UID: \"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356438 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356494 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356584 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356613 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351783 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: 
"e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.351913 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352213 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352229 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352338 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352337 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352428 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352513 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352526 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352584 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352698 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352729 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352751 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352886 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.352938 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353026 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353212 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353212 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353235 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353230 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353301 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353487 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.353947 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354280 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354369 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.354844 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355257 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355502 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355531 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.355796 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356286 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356467 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356554 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.356690 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.357409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.358202 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.358623 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.359092 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.359621 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.359818 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.360030 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.360089 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.357399 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.360407 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.360434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.360720 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.361072 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.361382 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.361655 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.361707 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.361868 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.362128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.362323 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.362406 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.362717 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.363011 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.363838 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.364082 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.364229 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.364511 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365062 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.364022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365092 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365351 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365501 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365683 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.365840 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.366034 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.366225 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.366282 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.367103 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.367452 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.367742 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368215 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368267 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368275 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368623 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368867 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.368917 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369116 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369136 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369301 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369346 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369506 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369587 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369618 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369773 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369808 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.369979 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.371540 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:27.871510351 +0000 UTC m=+21.298761086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.375255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.375705 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.375801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.375953 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.377498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.377965 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.378267 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.378484 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.378724 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379123 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379354 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379420 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379454 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379493 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.379690 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380270 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380365 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380386 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380749 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.380842 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.381317 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.381566 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.382184 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.382285 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.382737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.382866 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.382884 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.383029 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385863 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.383072 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385950 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.383122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384629 4810 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.383850 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384031 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.383554 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384596 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384623 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.384979 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385017 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385275 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385414 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.385524 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.386412 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.386421 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387006 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387020 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387487 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387721 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.387960 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.388247 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.388543 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.388558 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.388735 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.388982 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389082 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389379 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389505 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389909 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.389961 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.390960 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.392505 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.392842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.393205 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.393451 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.394862 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.395200 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.395263 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:27.895245117 +0000 UTC m=+21.322495932 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.395424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.395648 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.397293 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.397386 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:27.897365784 +0000 UTC m=+21.324616599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.397639 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.397954 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.398173 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.398720 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.398841 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.399382 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.399587 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.399613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.399831 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400130 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400134 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400315 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400587 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400740 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.400832 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.401080 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.401477 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.401502 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.401533 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.401586 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:27.901565416 +0000 UTC m=+21.328816151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.401590 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.402949 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.404115 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.404128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.404497 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.404672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.405259 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.408440 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.409619 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.409641 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.410487 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.411779 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.411808 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.411828 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.411877 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:27.911861571 +0000 UTC m=+21.339112396 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.412716 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.413909 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.415632 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.416511 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.417520 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.417921 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.419380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.420395 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.423274 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.431614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.435734 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.438579 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.440673 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79"} Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.441398 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.447771 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.451324 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.452317 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457517 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxq2d\" (UniqueName: \"kubernetes.io/projected/4d67ab19-ac19-4673-ade0-a35ffb299e85-kube-api-access-zxq2d\") pod \"node-resolver-d69n4\" (UID: \"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d67ab19-ac19-4673-ade0-a35ffb299e85-hosts-file\") pod \"node-resolver-d69n4\" (UID: \"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457692 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457768 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457780 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457791 4810 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457800 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457809 4810 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457817 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457826 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457835 4810 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457843 4810 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457851 4810 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457860 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457870 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457878 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457888 4810 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457941 4810 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457950 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.457992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4d67ab19-ac19-4673-ade0-a35ffb299e85-hosts-file\") pod \"node-resolver-d69n4\" (UID: 
\"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458017 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458064 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458076 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458089 4810 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458101 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458112 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458121 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458131 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458142 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458152 4810 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458164 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458174 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458184 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458195 4810 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458207 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458217 4810 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458226 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458236 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458247 4810 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458256 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458267 4810 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458277 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458287 4810 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458298 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458308 4810 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458319 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458329 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458339 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458349 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458361 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458371 4810 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458381 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458391 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458403 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458414 4810 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458426 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458438 4810 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458448 4810 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458458 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458468 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458478 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458487 4810 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458503 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458514 4810 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458524 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458535 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458545 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458555 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458565 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458576 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458587 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458599 4810 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458609 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458620 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458631 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458642 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458653 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458664 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458674 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458686 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458695 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458706 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458716 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458725 4810 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458735 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458746 4810 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458756 4810 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458766 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458777 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458787 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458799 4810 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458810 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458820 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458830 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458840 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458850 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458860 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458870 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458880 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458908 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458919 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458929 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458941 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458951 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458961 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458970 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458980 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.458991 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459001 4810 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459012 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459022 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459032 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459042 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459052 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459063 4810 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459073 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459084 4810 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459096 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459105 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459116 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459127 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459137 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459148 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459158 4810 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459167 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459178 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459188 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459198 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459208 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459220 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459230 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459239 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459250 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459260 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459272 4810 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459284 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459295 4810 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459305 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459316 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459325 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459336 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459346 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459358 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459368 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459378 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459387 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459398 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459407 4810 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459417 4810 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459427 4810 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459436 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459471 4810 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459480 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459490 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459500 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459511 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459521 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459533 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459543 4810 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459553 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459563 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.459603 4810 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460003 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460020 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460031 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460050 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460060 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460106 4810 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460119 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460131 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460143 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460155 4810 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460167 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460177 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460189 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460200 4810 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460210 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460220 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460230 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460241 4810 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460251 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460261 4810 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460273 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460290 4810 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460301 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460311 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460322 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460332 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460342 4810 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460353 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460363 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460373 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460384 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460398 4810 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460409 4810 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460419 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460432 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.460445 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.466928 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.481181 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxq2d\" (UniqueName: \"kubernetes.io/projected/4d67ab19-ac19-4673-ade0-a35ffb299e85-kube-api-access-zxq2d\") pod \"node-resolver-d69n4\" (UID: \"4d67ab19-ac19-4673-ade0-a35ffb299e85\") " pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.485479 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.498245 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.507241 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.518682 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.537502 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.548470 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.555098 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.564823 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.565680 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 06:56:27 crc kubenswrapper[4810]: W1003 06:56:27.568071 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-f1116a778ca256c7e1e6be9fd7e507a69629e5a3f7add4d60a1c1d46c8467d0d WatchSource:0}: Error finding container f1116a778ca256c7e1e6be9fd7e507a69629e5a3f7add4d60a1c1d46c8467d0d: Status 404 returned error can't find the container with id f1116a778ca256c7e1e6be9fd7e507a69629e5a3f7add4d60a1c1d46c8467d0d Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.575962 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.576014 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d69n4" Oct 03 06:56:27 crc kubenswrapper[4810]: W1003 06:56:27.579076 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-fd837c3b6c5a3a12f042de944cd12d543ad1a42b8d99cb66348c0ba9f9a5360b WatchSource:0}: Error finding container fd837c3b6c5a3a12f042de944cd12d543ad1a42b8d99cb66348c0ba9f9a5360b: Status 404 returned error can't find the container with id fd837c3b6c5a3a12f042de944cd12d543ad1a42b8d99cb66348c0ba9f9a5360b Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.580805 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 06:56:27 crc kubenswrapper[4810]: W1003 06:56:27.594482 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d67ab19_ac19_4673_ade0_a35ffb299e85.slice/crio-f364cf1b712d5333319fade4de2c9b39b8d72ff113a1c608aab9cb0573f00286 WatchSource:0}: Error finding container f364cf1b712d5333319fade4de2c9b39b8d72ff113a1c608aab9cb0573f00286: Status 404 returned error can't find the container with id f364cf1b712d5333319fade4de2c9b39b8d72ff113a1c608aab9cb0573f00286 Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.604193 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.625205 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.634156 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc 
kubenswrapper[4810]: I1003 06:56:27.646295 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.655663 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.666958 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.679293 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.693530 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.964605 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.964870 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:28.964779588 +0000 UTC m=+22.392030363 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.965249 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.965290 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.965319 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:27 crc kubenswrapper[4810]: I1003 06:56:27.965345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965469 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965531 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965547 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965566 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:28.965543589 +0000 UTC m=+22.392794334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965570 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965568 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965586 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965603 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965637 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:28.965627211 +0000 UTC m=+22.392877946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965648 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965800 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:28.965772545 +0000 UTC m=+22.393023280 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:27 crc kubenswrapper[4810]: E1003 06:56:27.965825 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:28.965818496 +0000 UTC m=+22.393069231 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.049957 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8pnks"] Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.050354 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-z8f25"] Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.050729 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6fdnr"] Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.050741 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.051122 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.052944 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whnpx"] Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.053197 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.053479 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.053784 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.053967 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.053968 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.054634 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.055128 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.058674 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.058772 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.058786 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.058763 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.058771 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.059250 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.059268 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.063326 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.063846 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.064011 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.064072 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.064217 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.064473 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.065172 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.072370 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.083819 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.099137 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.115169 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.127802 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.138487 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.148647 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.159652 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167452 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-conf-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167558 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wrr\" (UniqueName: \"kubernetes.io/projected/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-kube-api-access-z6wrr\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167622 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-system-cni-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167734 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-socket-dir-parent\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167808 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167859 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-bin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167885 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-daemon-config\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.167956 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-multus-certs\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-rootfs\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-proxy-tls\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168140 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168160 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-kubelet\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " 
pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168260 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-netns\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168332 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-cni-binary-copy\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168393 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168410 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-cni-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168466 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-multus\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168484 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cnibin\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168499 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168553 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet\") pod 
\"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168568 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168614 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4bdr\" (UniqueName: \"kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168646 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-mcd-auth-proxy-config\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168814 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168835 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168849 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-k8s-cni-cncf-io\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168878 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-os-release\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168931 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9rcz\" (UniqueName: \"kubernetes.io/projected/8a71d33c-dc75-4c28-bda0-0b3793de7de8-kube-api-access-m9rcz\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168961 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168975 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.168989 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169014 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-etc-kubernetes\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-hostroot\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc 
kubenswrapper[4810]: I1003 06:56:28.169114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-system-cni-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169101 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-cnibin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-os-release\") pod 
\"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169339 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169371 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzk26\" (UniqueName: \"kubernetes.io/projected/f9421086-70f1-441e-9aa0-5ac57a048c89-kube-api-access-bzk26\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.169401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.178864 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.191882 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.222414 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.250879 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-cnibin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270314 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-os-release\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270336 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-system-cni-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270390 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzk26\" (UniqueName: \"kubernetes.io/projected/f9421086-70f1-441e-9aa0-5ac57a048c89-kube-api-access-bzk26\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270410 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270459 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wrr\" (UniqueName: \"kubernetes.io/projected/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-kube-api-access-z6wrr\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270453 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-cnibin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270535 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270545 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-system-cni-dir\") pod \"multus-8pnks\" (UID: 
\"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270590 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270586 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270675 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270722 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-conf-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270766 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-conf-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270769 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-system-cni-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-system-cni-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270805 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-socket-dir-parent\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270838 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-bin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270857 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-daemon-config\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-os-release\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270910 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-multus-certs\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270919 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-bin\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.270873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-multus-certs\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271000 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271015 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-socket-dir-parent\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271056 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271030 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-kubelet\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271126 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-kubelet\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271134 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271163 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-rootfs\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-rootfs\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271191 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-proxy-tls\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271213 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: 
\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-netns\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271315 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-cni-binary-copy\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271320 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-netns\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271329 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-multus\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-cni-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271434 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271456 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 
06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271487 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4bdr\" (UniqueName: \"kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cnibin\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271555 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271600 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-k8s-cni-cncf-io\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271642 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-mcd-auth-proxy-config\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271661 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271682 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc 
kubenswrapper[4810]: I1003 06:56:28.271701 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271767 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271802 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-os-release\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9rcz\" (UniqueName: \"kubernetes.io/projected/8a71d33c-dc75-4c28-bda0-0b3793de7de8-kube-api-access-m9rcz\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271851 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-hostroot\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-etc-kubernetes\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271926 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271968 4810 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-etc-kubernetes\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.271346 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-var-lib-cni-multus\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272160 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272167 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-cni-dir\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272227 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272265 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272293 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272321 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272353 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log\") pod \"ovnkube-node-whnpx\" (UID: 
\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-cni-binary-copy\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272497 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-cnibin\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-mcd-auth-proxy-config\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272566 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-host-run-k8s-cni-cncf-io\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-os-release\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272660 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272682 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272704 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.272707 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9421086-70f1-441e-9aa0-5ac57a048c89-hostroot\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.273082 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9421086-70f1-441e-9aa0-5ac57a048c89-multus-daemon-config\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.273100 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.273197 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a71d33c-dc75-4c28-bda0-0b3793de7de8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.273339 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.273591 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.276729 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.277180 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-proxy-tls\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.292371 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9rcz\" (UniqueName: \"kubernetes.io/projected/8a71d33c-dc75-4c28-bda0-0b3793de7de8-kube-api-access-m9rcz\") pod \"multus-additional-cni-plugins-6fdnr\" (UID: \"8a71d33c-dc75-4c28-bda0-0b3793de7de8\") " pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.296364 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzk26\" (UniqueName: \"kubernetes.io/projected/f9421086-70f1-441e-9aa0-5ac57a048c89-kube-api-access-bzk26\") pod \"multus-8pnks\" (UID: \"f9421086-70f1-441e-9aa0-5ac57a048c89\") " pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.297005 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4bdr\" (UniqueName: \"kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr\") pod \"ovnkube-node-whnpx\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.301306 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wrr\" (UniqueName: \"kubernetes.io/projected/e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930-kube-api-access-z6wrr\") pod \"machine-config-daemon-z8f25\" (UID: \"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\") " pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.308272 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.321380 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.343752 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.361096 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d74
2fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.370592 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8pnks" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.378669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.378772 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.384662 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.395771 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:28 crc kubenswrapper[4810]: W1003 06:56:28.407704 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a71d33c_dc75_4c28_bda0_0b3793de7de8.slice/crio-1309cfbf1dd50127be6d288afb9006df1a55fbc077c240f0aeb645f6b4397d02 WatchSource:0}: Error finding container 1309cfbf1dd50127be6d288afb9006df1a55fbc077c240f0aeb645f6b4397d02: Status 404 returned error can't find the container with id 1309cfbf1dd50127be6d288afb9006df1a55fbc077c240f0aeb645f6b4397d02 Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.412812 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: W1003 06:56:28.418531 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12d3cfb_2ba7_4eb6_b6b4_bfc4cec93930.slice/crio-240f730013531a89b1e90141feaea8335a313aed120d3b3286e578fd82ebd2b8 WatchSource:0}: Error finding container 240f730013531a89b1e90141feaea8335a313aed120d3b3286e578fd82ebd2b8: Status 404 returned error can't find the container with id 240f730013531a89b1e90141feaea8335a313aed120d3b3286e578fd82ebd2b8 Oct 03 06:56:28 crc kubenswrapper[4810]: W1003 06:56:28.431339 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88c6d2ac_d97b_43a1_8bf7_3cc367fe108e.slice/crio-6fe74cdfc6724198f6aa4e4428dc76ed20636bcf9b1e243a03d534e69eda887a WatchSource:0}: Error finding container 6fe74cdfc6724198f6aa4e4428dc76ed20636bcf9b1e243a03d534e69eda887a: Status 404 returned error can't find the container with id 6fe74cdfc6724198f6aa4e4428dc76ed20636bcf9b1e243a03d534e69eda887a Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.441610 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.452471 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"240f730013531a89b1e90141feaea8335a313aed120d3b3286e578fd82ebd2b8"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.455456 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerStarted","Data":"1309cfbf1dd50127be6d288afb9006df1a55fbc077c240f0aeb645f6b4397d02"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.457573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerStarted","Data":"1b56ae95970bbf98891eb668e54766ba2ddeea15139c67a579e39701ed1bcf48"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.470835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.470882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f1116a778ca256c7e1e6be9fd7e507a69629e5a3f7add4d60a1c1d46c8467d0d"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.472714 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.477725 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d69n4" event={"ID":"4d67ab19-ac19-4673-ade0-a35ffb299e85","Type":"ContainerStarted","Data":"9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.477782 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d69n4" event={"ID":"4d67ab19-ac19-4673-ade0-a35ffb299e85","Type":"ContainerStarted","Data":"f364cf1b712d5333319fade4de2c9b39b8d72ff113a1c608aab9cb0573f00286"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.479990 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd837c3b6c5a3a12f042de944cd12d543ad1a42b8d99cb66348c0ba9f9a5360b"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.487780 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.487827 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.487880 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e9505249bd2804c993dc8f5d966c83987d88b773a1112a7cd8bd0d41a3c77383"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.491445 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.500109 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"6fe74cdfc6724198f6aa4e4428dc76ed20636bcf9b1e243a03d534e69eda887a"} Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.504763 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.517160 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.530754 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.548879 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.568875 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.581701 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.601208 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.616274 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.629621 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.642151 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.655525 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-contr
oller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.690992 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.735365 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\
\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b1
7b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:28Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.980027 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.980161 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980201 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:30.980172551 +0000 UTC m=+24.407423286 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.980236 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980248 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.980273 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980290 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:30.980280234 +0000 UTC m=+24.407530969 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: I1003 06:56:28.980304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980385 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980402 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980424 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980402 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980429 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:30.980419768 +0000 UTC m=+24.407670503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980451 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980460 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980486 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:30.98047861 +0000 UTC m=+24.407729345 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980439 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:28 crc kubenswrapper[4810]: E1003 06:56:28.980518 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:30.980511211 +0000 UTC m=+24.407761946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.302007 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.302049 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.302024 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:29 crc kubenswrapper[4810]: E1003 06:56:29.302161 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:29 crc kubenswrapper[4810]: E1003 06:56:29.302216 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:29 crc kubenswrapper[4810]: E1003 06:56:29.302333 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.306323 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.307463 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.308967 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.309810 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.311096 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.311708 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.312407 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.313617 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.314500 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.315575 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.316311 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.317942 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.318602 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.319242 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.320375 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.321102 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.322352 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.322777 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.323719 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.324985 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.325761 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.326913 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.327454 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.328815 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.329242 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.329872 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.334095 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.334722 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.335853 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.336471 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.337477 4810 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.337591 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.339239 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.340197 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.340683 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.342259 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.342964 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.343853 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.344514 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.345632 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.346227 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.347187 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.348186 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.348815 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.349281 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.350208 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.351052 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.351936 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.352450 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.353371 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.353882 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.354918 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.355489 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.355998 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.503675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerStarted","Data":"22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06"} Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.505205 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" 
containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" exitCode=0 Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.505289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.508098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e"} Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.508144 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6"} Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.509838 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455" exitCode=0 Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.509912 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455"} Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.530536 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.550251 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.562265 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.576019 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.589925 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.606351 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.618601 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.636524 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.651915 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.667059 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.687426 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.699390 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.712379 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.723584 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.
11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.736342 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc 
kubenswrapper[4810]: I1003 06:56:29.750616 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.763070 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.775143 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.786242 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.797430 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.808754 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.823273 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.848908 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.870503 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.886572 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.901078 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.948377 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vnb5p"] Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.948931 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.951648 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.951700 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.952480 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.953804 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.966264 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.982723 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:29 crc kubenswrapper[4810]: I1003 06:56:29.998764 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:29Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.015738 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.049301 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.088131 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.092534 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-host\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.092580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxvrz\" (UniqueName: \"kubernetes.io/projected/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-kube-api-access-sxvrz\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.092603 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-serviceca\") pod \"node-ca-vnb5p\" (UID: 
\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.128969 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.169313 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.193806 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-serviceca\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.195003 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-host\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.195068 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxvrz\" (UniqueName: \"kubernetes.io/projected/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-kube-api-access-sxvrz\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.195331 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-host\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.195599 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-serviceca\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.213692 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc 
kubenswrapper[4810]: I1003 06:56:30.239399 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxvrz\" (UniqueName: \"kubernetes.io/projected/59e2d65e-f3c1-4b45-8c19-b033bd5f3aac-kube-api-access-sxvrz\") pod \"node-ca-vnb5p\" (UID: \"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\") " pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.271988 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.309972 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.337204 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vnb5p" Oct 03 06:56:30 crc kubenswrapper[4810]: W1003 06:56:30.350236 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e2d65e_f3c1_4b45_8c19_b033bd5f3aac.slice/crio-23d59e1c4d5032f7e70f217611738c31404a4738c1007ca6b503eaf151ef6de9 WatchSource:0}: Error finding container 23d59e1c4d5032f7e70f217611738c31404a4738c1007ca6b503eaf151ef6de9: Status 404 returned error can't find the container with id 23d59e1c4d5032f7e70f217611738c31404a4738c1007ca6b503eaf151ef6de9 Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.357773 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.410017 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.469307 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.477808 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.492226 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.495141 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.496316 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.516484 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerStarted","Data":"12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.518706 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.526724 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.526783 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.526798 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.526813 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.529177 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnb5p" event={"ID":"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac","Type":"ContainerStarted","Data":"23d59e1c4d5032f7e70f217611738c31404a4738c1007ca6b503eaf151ef6de9"} Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.535637 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: E1003 06:56:30.548881 4810 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.591581 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent
\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.631155 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.672065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.716526 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z 
is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.749603 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.789845 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.828566 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.874231 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.912725 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.949185 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:30 crc kubenswrapper[4810]: I1003 06:56:30.994242 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:30Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.002161 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.002331 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002345 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:35.002320588 +0000 UTC m=+28.429571343 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002498 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002519 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002533 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002592 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:35.002578244 +0000 UTC m=+28.429828989 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.002638 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002782 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002846 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:35.002832261 +0000 UTC m=+28.430083006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.002869 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.002870 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.003069 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:35.003054867 +0000 UTC m=+28.430305612 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.002951 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.003236 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.003259 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.003273 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.003326 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:35.003312353 +0000 UTC m=+28.430563098 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.033242 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc 
kubenswrapper[4810]: I1003 06:56:31.079944 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.118098 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.155023 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.194226 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.237284 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.279381 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff9
5bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.301688 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.301734 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.301747 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.301968 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.302093 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:31 crc kubenswrapper[4810]: E1003 06:56:31.302139 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.314718 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[cluster-policy-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\"
,\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.359552 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.403449 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.432415 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 
2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.470760 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.510740 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.534462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vnb5p" event={"ID":"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac","Type":"ContainerStarted","Data":"bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a"} Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.538232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.538318 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.540219 4810 generic.go:334] "Generic 
(PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8" exitCode=0 Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.540342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8"} Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.541928 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.545778 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.553302 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.590652 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.633626 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\
":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.671975 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.709774 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.750664 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.790636 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.829719 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.874142 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.912390 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.953782 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:31 crc kubenswrapper[4810]: I1003 06:56:31.996793 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff9
5bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:31Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.030812 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.075856 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.114553 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.155471 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.191863 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.249333 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z 
is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.546716 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe" exitCode=0 Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.546810 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.579049 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.595848 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.612235 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.648632 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.665360 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.681424 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.705759 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff9
5bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.721810 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.734698 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.744506 4810 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.746335 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.746359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.746368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.746474 4810 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.752293 4810 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.752603 4810 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.753660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.753815 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.753928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.754035 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.754143 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.756922 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z 
is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.768347 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.770746 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.771922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.771959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.771970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.771984 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.771995 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.782466 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.785473 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.789926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.789970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.789982 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.790003 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.790017 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.795918 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.805581 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.810425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.810464 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.810473 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.810486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.810497 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.823367 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.827703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.827771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.827787 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.827809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.827828 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.832094 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.844057 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: E1003 06:56:32.844280 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.846405 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.846450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.846460 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.846480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.846492 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.870524 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:32Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.948704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.948746 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.948755 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.948769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:32 crc kubenswrapper[4810]: I1003 06:56:32.948779 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:32Z","lastTransitionTime":"2025-10-03T06:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.051246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.051306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.051333 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.051352 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.051363 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.154353 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.154394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.154403 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.154423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.154433 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.258771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.258854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.258877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.258934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.258968 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.301792 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.301993 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:33 crc kubenswrapper[4810]: E1003 06:56:33.302143 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.302191 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:33 crc kubenswrapper[4810]: E1003 06:56:33.302387 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:33 crc kubenswrapper[4810]: E1003 06:56:33.302580 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.361412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.361443 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.361453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.361466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.361475 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.465356 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.465417 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.465437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.465461 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.465478 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.556516 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.559442 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488" exitCode=0 Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.559487 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.568500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.568559 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.568579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.568602 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.568621 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.580404 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.604377 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.622725 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.642651 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.670744 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.673310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.673376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.673396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.673424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.673445 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.688095 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.703296 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.720053 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.735974 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.752955 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.776675 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.776717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.776730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.776747 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.776761 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.782380 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa0
57caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.805959 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.825786 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.839014 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.851943 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:33Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.879659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.879712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.879730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.879749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.879763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.982866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.982980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.983006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.983038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:33 crc kubenswrapper[4810]: I1003 06:56:33.983062 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:33Z","lastTransitionTime":"2025-10-03T06:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.086366 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.086418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.086434 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.086456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.086473 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.189657 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.189710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.189722 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.189740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.189754 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.293397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.293450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.293465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.293485 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.293501 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.396641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.396718 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.396734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.396758 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.396775 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.502671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.502722 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.502739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.504147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.504190 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.566157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerStarted","Data":"c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.590973 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec1
3b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.607715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.607794 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.607818 4810 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.607848 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.607872 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.609393 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.634377 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.668059 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 
2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.687933 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.706666 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.710428 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.710462 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.710471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.710487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.710499 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.721646 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.734084 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.746422 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700
a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.757760 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.771540 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.785828 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.800108 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813319 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813347 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813400 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.813716 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.841778 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.916499 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.916544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.916555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.916571 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:34 crc kubenswrapper[4810]: I1003 06:56:34.916583 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:34Z","lastTransitionTime":"2025-10-03T06:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.019593 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.019664 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.019688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.019720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.019742 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.047657 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.047820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.047948 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.047877802 +0000 UTC m=+36.475128567 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.047958 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.048009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048035 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.048012276 +0000 UTC m=+36.475263041 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.048061 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.048101 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048226 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048279 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.048261332 +0000 UTC m=+36.475512097 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048348 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048396 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048408 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048453 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048467 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048529 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.048508129 +0000 UTC m=+36.475758944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048425 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.048664 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.048639742 +0000 UTC m=+36.475890587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.121980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.122019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.122032 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.122050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.122063 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.225441 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.225978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.225996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.226019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.226039 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.302333 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.302387 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.302419 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.302512 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.302709 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:35 crc kubenswrapper[4810]: E1003 06:56:35.302973 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.329663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.329702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.329715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.329737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.329753 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.433931 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.433976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.433990 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.434008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.434020 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.537469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.537525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.537538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.537555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.537568 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.575864 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415" exitCode=0 Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.575969 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.590660 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.591267 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.591339 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.599565 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.625995 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.628484 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.629264 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640478 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640491 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640510 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640522 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.640547 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.650756 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.662916 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.678607 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d4
0af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.697678 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.713219 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.729000 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.742182 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.743169 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.743185 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.743193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.743206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.743215 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.755743 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.770372 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.784298 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.797685 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.816491 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z 
is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.828813 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.846158 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.846205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.846217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.846235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.846248 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.858171 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.871673 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.884681 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.901797 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.914917 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.936340 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.947627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.947652 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.947660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.947672 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.947682 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:35Z","lastTransitionTime":"2025-10-03T06:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.955539 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:35 crc kubenswrapper[4810]: I1003 06:56:35.985263 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:35Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.007683 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.022390 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.034432 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.049016 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.051204 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.051261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.051277 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.051297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.051315 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.067448 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.085746 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cni
bin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.153977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.154044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.154062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.154084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.154101 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.257375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.257461 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.257486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.257519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.257544 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.360007 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.360045 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.360055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.360068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.360077 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.462778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.462838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.462852 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.462873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.462888 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.565829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.565880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.565922 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.565942 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.565956 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.597171 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a71d33c-dc75-4c28-bda0-0b3793de7de8" containerID="b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89" exitCode=0 Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.597258 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerDied","Data":"b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.597343 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.611148 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 
06:56:36.635677 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.657226 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.668632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.668672 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.668684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.668702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.668715 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.671935 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.686712 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.700712 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.717090 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.730479 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.752581 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771283 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.771780 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.785555 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.799885 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.821780 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb00
14ac19a2a0689d1ece58541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.837698 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.858263 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:36Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.874498 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.874535 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.874546 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.874561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.874572 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.977031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.977095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.977106 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.977119 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:36 crc kubenswrapper[4810]: I1003 06:56:36.977130 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:36Z","lastTransitionTime":"2025-10-03T06:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.079479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.079529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.079541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.079563 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.079579 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.182573 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.182627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.182639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.182657 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.182670 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.288359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.288797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.288824 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.288851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.288885 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.301573 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.301687 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.301597 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:37 crc kubenswrapper[4810]: E1003 06:56:37.301802 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:37 crc kubenswrapper[4810]: E1003 06:56:37.301988 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:37 crc kubenswrapper[4810]: E1003 06:56:37.302062 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.328864 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8b
b43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.345644 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.358673 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.371704 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.384804 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.392379 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.392421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.392434 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.392451 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.392464 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.398471 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.413925 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-oper
ator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.428957 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.449072 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb00
14ac19a2a0689d1ece58541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.463650 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.479952 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.492547 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.498593 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.498654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.498665 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.498683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.498693 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.503313 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.514831 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.530606 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.600619 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.600660 4810 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.600669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.600682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.600694 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.605185 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.606517 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" event={"ID":"8a71d33c-dc75-4c28-bda0-0b3793de7de8","Type":"ContainerStarted","Data":"dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.622222 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.649947 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\
\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:
31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e617
3e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.680818 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.697386 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.703102 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.703151 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.703163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.703183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.703195 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.710346 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.722559 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.732325 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.743005 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.752546 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.762979 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.778507 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb00
14ac19a2a0689d1ece58541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.789216 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.798426 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.805975 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.806027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.806040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.806067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.806081 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.810110 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.821235 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.908951 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.908988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.908997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.909011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:37 crc kubenswrapper[4810]: I1003 06:56:37.909022 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:37Z","lastTransitionTime":"2025-10-03T06:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.011542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.011582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.011591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.011605 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.011615 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.114349 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.114404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.114414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.114427 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.114462 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.216620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.216683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.216701 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.216725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.216746 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.319620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.319674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.319686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.319704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.319716 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.421403 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.421440 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.421449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.421462 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.421472 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.524228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.524270 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.524283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.524304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.524323 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.610739 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/0.log" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.614050 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e" exitCode=1 Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.614112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.615050 4810 scope.go:117] "RemoveContainer" containerID="45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.628533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.628565 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.628574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.628589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.628599 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.634232 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.649336 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 
2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.671740 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.686917 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.700203 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.716667 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.728663 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.730705 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.730740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.730751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.730764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.730774 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.746545 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.758108 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.772657 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.795825 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:38Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:141\\\\nI1003 06:56:38.353240 6074 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 06:56:38.353692 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 06:56:38.353711 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 06:56:38.353722 6074 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 06:56:38.353727 6074 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 06:56:38.353748 6074 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 06:56:38.353758 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 06:56:38.353764 6074 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 06:56:38.353772 6074 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 06:56:38.354126 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 06:56:38.354165 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 06:56:38.354202 6074 factory.go:656] Stopping watch factory\\\\nI1003 06:56:38.354207 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 06:56:38.354223 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.812752 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.826520 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.833482 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.833566 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.833601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.833654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc 
kubenswrapper[4810]: I1003 06:56:38.833678 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.838595 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.853423 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:38Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.936426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.936489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.936507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.936532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:38 crc kubenswrapper[4810]: I1003 06:56:38.936550 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:38Z","lastTransitionTime":"2025-10-03T06:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.010377 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.039782 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.039868 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.039944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.039997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.040047 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.142425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.142488 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.142507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.142533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.142554 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.245949 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.246037 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.246057 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.246111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.246132 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.302025 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.302128 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:39 crc kubenswrapper[4810]: E1003 06:56:39.302200 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.302151 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:39 crc kubenswrapper[4810]: E1003 06:56:39.302401 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:39 crc kubenswrapper[4810]: E1003 06:56:39.302537 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.348745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.348823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.348841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.348866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.348885 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.452991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.453054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.453067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.453092 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.453105 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.555544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.555597 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.555612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.555630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.555644 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.619771 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/0.log" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.622281 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.622712 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.645188 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.657740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.657773 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.657784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.657797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.657807 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.660737 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.676936 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.693835 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.708356 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.724203 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.739881 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.759905 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.759939 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.759949 4810 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.759964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.759975 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.765933 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.785809 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:38Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:141\\\\nI1003 06:56:38.353240 6074 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 06:56:38.353692 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 06:56:38.353711 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 06:56:38.353722 6074 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 06:56:38.353727 6074 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 06:56:38.353748 6074 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 06:56:38.353758 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 06:56:38.353764 6074 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 06:56:38.353772 6074 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 06:56:38.354126 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 06:56:38.354165 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 06:56:38.354202 6074 factory.go:656] Stopping watch factory\\\\nI1003 06:56:38.354207 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 06:56:38.354223 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.805373 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.819115 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.833823 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.849468 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.862959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.863016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.863031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.863054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.863070 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.864074 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.879441 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472
833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:39Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.966318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.966795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.966829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.966855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:39 crc kubenswrapper[4810]: I1003 06:56:39.966872 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:39Z","lastTransitionTime":"2025-10-03T06:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.070365 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.070434 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.070467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.070497 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.070524 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.174708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.174775 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.174792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.174817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.174835 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.278205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.278264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.278283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.278306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.278324 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.382798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.382865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.382876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.382914 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.382927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.486068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.486126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.486140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.486165 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.486182 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.521174 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk"] Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.521775 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.525307 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.525748 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.561143 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.577210 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.589468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.589499 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.589507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.589543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.589554 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.590741 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.606274 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.608662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc 
kubenswrapper[4810]: I1003 06:56:40.608730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz4ct\" (UniqueName: \"kubernetes.io/projected/44db27b9-37f2-443a-8c72-3109af8b80bf-kube-api-access-mz4ct\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.608842 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44db27b9-37f2-443a-8c72-3109af8b80bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.608968 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.620594 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.628470 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/1.log" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.629314 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/0.log" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.632399 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b" exitCode=1 Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.632447 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.632524 4810 scope.go:117] "RemoveContainer" containerID="45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.633209 4810 scope.go:117] "RemoveContainer" containerID="c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b" Oct 03 06:56:40 crc kubenswrapper[4810]: E1003 06:56:40.633403 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.641361 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.656577 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.674614 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.692737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.692788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.692803 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.692825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.692841 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.693382 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:38Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:141\\\\nI1003 06:56:38.353240 6074 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 06:56:38.353692 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 06:56:38.353711 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 06:56:38.353722 6074 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 06:56:38.353727 6074 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 06:56:38.353748 6074 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 06:56:38.353758 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 06:56:38.353764 6074 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 06:56:38.353772 6074 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 06:56:38.354126 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 06:56:38.354165 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 06:56:38.354202 6074 factory.go:656] Stopping watch factory\\\\nI1003 06:56:38.354207 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 06:56:38.354223 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.706089 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.709521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz4ct\" (UniqueName: \"kubernetes.io/projected/44db27b9-37f2-443a-8c72-3109af8b80bf-kube-api-access-mz4ct\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.709637 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44db27b9-37f2-443a-8c72-3109af8b80bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.709681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.709728 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.710303 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.710491 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/44db27b9-37f2-443a-8c72-3109af8b80bf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.718597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/44db27b9-37f2-443a-8c72-3109af8b80bf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.722212 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.727512 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz4ct\" (UniqueName: \"kubernetes.io/projected/44db27b9-37f2-443a-8c72-3109af8b80bf-kube-api-access-mz4ct\") pod \"ovnkube-control-plane-749d76644c-pz9mk\" (UID: \"44db27b9-37f2-443a-8c72-3109af8b80bf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.737634 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.749769 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.762249 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.776625 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.795967 4810 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.796029 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.796045 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.796068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.796080 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.797806 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.810542 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.828737 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\"
,\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.836652 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.843533 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: W1003 06:56:40.853434 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44db27b9_37f2_443a_8c72_3109af8b80bf.slice/crio-77ee332dd2f09bc054d8d152c29ac6ba59608c82f7ebb52f206a13869b0b896b WatchSource:0}: Error finding container 77ee332dd2f09bc054d8d152c29ac6ba59608c82f7ebb52f206a13869b0b896b: Status 404 returned error can't find the container with id 77ee332dd2f09bc054d8d152c29ac6ba59608c82f7ebb52f206a13869b0b896b Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.857259 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.872508 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.890089 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.898458 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.898494 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.898505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.898524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.898536 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:40Z","lastTransitionTime":"2025-10-03T06:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.905475 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.934748 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.955378 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.973227 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:40 crc kubenswrapper[4810]: I1003 06:56:40.996964 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45ecca34a913ac32447cf6e7f1d603d5faedbb0014ac19a2a0689d1ece58541e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:38Z\\\",\\\"message\\\":\\\"mers/externalversions/factory.go:141\\\\nI1003 06:56:38.353240 6074 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 06:56:38.353692 6074 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 06:56:38.353711 6074 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 06:56:38.353722 6074 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 06:56:38.353727 6074 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 06:56:38.353748 6074 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 06:56:38.353758 6074 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 06:56:38.353764 6074 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 06:56:38.353772 6074 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 06:56:38.354126 6074 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 06:56:38.354165 6074 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 06:56:38.354202 6074 factory.go:656] Stopping watch factory\\\\nI1003 06:56:38.354207 6074 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 06:56:38.354223 6074 ovnkube.go:599] Stopped ovnkube\\\\nI1003 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:40Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.000694 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.000740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.000750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.000772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.000782 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.009717 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.024055 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.038510 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.054411 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.071503 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.103321 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.103371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.103380 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.103395 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.103408 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.207034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.207088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.207100 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.207123 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.207137 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.302613 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.302658 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.302856 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.303009 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.303445 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.303935 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.311676 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.311731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.311747 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.311768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.311785 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.414814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.414877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.414912 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.414936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.414952 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.517570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.517627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.517639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.517659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.517674 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.620480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.620554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.620570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.620594 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.620612 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.638550 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/1.log" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.643237 4810 scope.go:117] "RemoveContainer" containerID="c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.643520 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.643878 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" event={"ID":"44db27b9-37f2-443a-8c72-3109af8b80bf","Type":"ContainerStarted","Data":"b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.643957 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" event={"ID":"44db27b9-37f2-443a-8c72-3109af8b80bf","Type":"ContainerStarted","Data":"249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.643973 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" event={"ID":"44db27b9-37f2-443a-8c72-3109af8b80bf","Type":"ContainerStarted","Data":"77ee332dd2f09bc054d8d152c29ac6ba59608c82f7ebb52f206a13869b0b896b"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.658389 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.666175 4810 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-multus/network-metrics-daemon-drrxm"] Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.666782 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.666872 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.676554 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2
eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\
":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.690424 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.705251 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.718681 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.718775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgs4v\" (UniqueName: \"kubernetes.io/projected/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-kube-api-access-bgs4v\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.723835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.723872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.723882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.723908 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.723918 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.730092 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.742960 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.756926 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.771978 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.786225 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.802416 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.817588 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.820061 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.820122 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs4v\" (UniqueName: \"kubernetes.io/projected/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-kube-api-access-bgs4v\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.820291 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:41 crc kubenswrapper[4810]: E1003 06:56:41.820389 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:42.320365763 +0000 UTC m=+35.747616498 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.826250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.826293 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.826302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.826317 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.826330 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.837657 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs4v\" (UniqueName: \"kubernetes.io/projected/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-kube-api-access-bgs4v\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.839387 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"
name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.853432 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.864580 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.878052 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.889842 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.901971 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.915350 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.929017 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.929060 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.929070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.929085 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.929096 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:41Z","lastTransitionTime":"2025-10-03T06:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.933343 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.947932 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.962597 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc 
kubenswrapper[4810]: I1003 06:56:41.982465 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:41 crc kubenswrapper[4810]: I1003 06:56:41.995688 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:41Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.009001 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.033160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.033219 4810 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.033235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.033259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.033275 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.035695 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e4
8ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.053176 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.068367 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.083875 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.099574 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.124979 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.137423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.137513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.137538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.137576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.137603 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.148926 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.168632 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.185538 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.240827 4810 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.240878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.240927 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.240956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.240972 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.327024 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.327264 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.327375 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:43.327352701 +0000 UTC m=+36.754603456 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.343598 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.343641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.343652 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.343668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.343681 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.447343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.447388 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.447398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.447415 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.447453 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.550304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.550351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.550364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.550383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.550403 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.653287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.653335 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.653344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.653362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.653373 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.756656 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.756707 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.756720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.756741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.756754 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.859973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.860055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.860074 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.860109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.860129 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.865731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.865800 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.865820 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.865842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.865856 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.880328 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.885489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.885564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.885582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.885638 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.885657 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.905377 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.911460 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.911508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.911518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.911538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.911550 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.927997 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.933576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.933636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.933651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.933674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.933689 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.950709 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.955581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.955633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.955649 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.955667 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.955682 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.970915 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:42Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:42 crc kubenswrapper[4810]: E1003 06:56:42.971091 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.973005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
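
The retries above all fail at the same point: the node-status patch is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24, well before the current time 2025-10-03, until the kubelet finally gives up with "update node status exceeds retry count". Below is a minimal Go sketch for confirming this from the node itself; the endpoint address is taken from the log, and verification is skipped only so the expired certificate can still be read and its validity window printed.

```go
// Sketch: dial the webhook endpoint named in the kubelet error and print the
// validity window of the certificate it serves. InsecureSkipVerify is used
// purely for inspection of an already-expired certificate, not for trust.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect only; do not trust this connection
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	state := conn.ConnectionState()
	if len(state.PeerCertificates) == 0 {
		log.Fatal("no peer certificate presented")
	}
	cert := state.PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:   %v (now %s)\n", now.After(cert.NotAfter), now.Format(time.RFC3339))
}
```

Once a renewed certificate is served on that port, the same status patch should be admitted and the retry loop above no longer exhausts its attempts.
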
event="NodeHasSufficientMemory" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.973049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.973061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.973116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:42 crc kubenswrapper[4810]: I1003 06:56:42.973135 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:42Z","lastTransitionTime":"2025-10-03T06:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.076125 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.076196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.076214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.076239 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.076260 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.135299 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.135476 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.135614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.135662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.135709 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.135728 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.135855 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:59.135818616 +0000 UTC m=+52.563069391 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.135947 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.135947 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.135978 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136246 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136339 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136367 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136264 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136154 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:59.136115145 +0000 UTC m=+52.563365910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136555 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:56:59.136517975 +0000 UTC m=+52.563768750 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136584 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:59.136569787 +0000 UTC m=+52.563820562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.136604 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:59.136593747 +0000 UTC m=+52.563844522 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.179528 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.179575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.179591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.179617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.179636 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
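
The UnmountVolume.TearDown failure above is a separate issue from the webhook error: the kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI drivers, so the CSI volume cannot be torn down yet. The sketch below simply lists whatever is present under the kubelet plugin-registration directory so that list can be compared with the driver name in the error; the path /var/lib/kubelet/plugins_registry is the conventional default and is an assumption here, not something taken from this log.

```go
// Sketch, assuming the default kubelet plugin-registration directory:
// list the registration entries (typically per-driver sockets) so the
// "not found in the list of registered CSI drivers" error can be checked
// against what is actually registered on the node.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	const regDir = "/var/lib/kubelet/plugins_registry" // assumed default path
	entries, err := os.ReadDir(regDir)
	if err != nil {
		log.Fatalf("read %s: %v", regDir, err)
	}
	if len(entries) == 0 {
		fmt.Println("no plugin registration entries found")
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(regDir, e.Name()))
	}
}
```
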
Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.282806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.282857 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.282866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.282884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.282917 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.301538 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.301596 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.301565 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.301808 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.301802 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.301952 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.302114 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.302427 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.337704 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.338020 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: E1003 06:56:43.338155 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:45.33811636 +0000 UTC m=+38.765367265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.386459 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.386514 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.386531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.386555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.386572 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.490119 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.490187 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.490205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.490231 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.490255 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.593669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.593734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.593751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.593772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.593785 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.696711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.697191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.697210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.697237 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.697279 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.800057 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.800109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.800120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.800139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.800153 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.903991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.904072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.904114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.904154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:43 crc kubenswrapper[4810]: I1003 06:56:43.904180 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:43Z","lastTransitionTime":"2025-10-03T06:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.008625 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.008699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.008717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.008744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.008764 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.112367 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.112436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.112453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.112477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.112493 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.215398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.215470 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.215493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.215530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.215554 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.318532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.318642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.318662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.318693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.318715 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.421938 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.421996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.422008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.422027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.422042 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.526055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.526508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.526660 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.526842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.527042 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.630882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.630962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.630973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.630998 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.631015 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.733388 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.733469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.733493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.733525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.733549 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.756323 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.780411 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-op
erator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 
06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.800238 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.816743 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.835834 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.836655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.836704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.836723 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.836747 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.836767 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.854489 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.877192 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472
833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.901559 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.925528 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.940484 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.940553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.940574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.940600 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.940622 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:44Z","lastTransitionTime":"2025-10-03T06:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.943260 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.974121 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:44 crc kubenswrapper[4810]: I1003 06:56:44.989106 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.005014 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.026658 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.041402 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\
\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.043532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.043655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.043678 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.043708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.043729 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.055587 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.072467 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.091416 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPa
th\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped 
already, failed to start node network controller: fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.147220 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.147312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.147335 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.147368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.147397 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.249575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.249641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.249658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.249697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.249714 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.302519 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.302668 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.303037 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.303083 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.303123 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.303177 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.303340 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.303607 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.352630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.352723 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.352748 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.352778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.352802 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.369312 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.369478 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:45 crc kubenswrapper[4810]: E1003 06:56:45.369562 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:49.369542094 +0000 UTC m=+42.796792829 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.456822 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.456879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.456903 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.456923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.456938 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.559994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.560062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.560078 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.560100 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.560115 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.663352 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.663399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.663410 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.663426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.663436 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.766348 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.766389 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.766397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.766410 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.766420 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.869948 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.870044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.870063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.870088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.870108 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.973050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.973108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.973119 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.973140 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:45 crc kubenswrapper[4810]: I1003 06:56:45.973152 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:45Z","lastTransitionTime":"2025-10-03T06:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.076165 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.076211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.076223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.076240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.076252 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.179772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.179851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.179873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.179934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.179956 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.282991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.283049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.283067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.283090 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.283110 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.385878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.385966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.385985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.386011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.386030 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.489284 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.489383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.489402 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.489432 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.489453 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.593104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.593177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.593198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.593229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.593251 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.697183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.697260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.697280 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.697305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.697327 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.800397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.800463 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.800487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.800512 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.800529 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.904015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.904124 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.904147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.904215 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:46 crc kubenswrapper[4810]: I1003 06:56:46.904235 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:46Z","lastTransitionTime":"2025-10-03T06:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.007218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.007278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.007297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.007325 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.007356 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.114867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.114995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.115020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.115056 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.115199 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.219138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.219205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.219229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.219267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.219291 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.302280 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.302478 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.302732 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:47 crc kubenswrapper[4810]: E1003 06:56:47.302713 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.302841 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:47 crc kubenswrapper[4810]: E1003 06:56:47.302989 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:47 crc kubenswrapper[4810]: E1003 06:56:47.303130 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:47 crc kubenswrapper[4810]: E1003 06:56:47.303386 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.320859 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.322930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.322991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.323010 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.323039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.323060 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.340252 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.363094 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.381721 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.411100 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.426628 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.426700 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc 
kubenswrapper[4810]: I1003 06:56:47.426720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.426753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.426771 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.435940 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.456402 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.475886 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.493173 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.514672 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.530818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.530929 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.530961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.530992 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.531016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.533332 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.566491 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.584518 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.606425 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.634437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.634481 4810 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.634493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.634515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.634527 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.635265 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e4
8ca4fec6fd9245d48a74a68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.655170 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.675455 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.736651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.736713 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.736733 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.736758 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.736775 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.839866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.839961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.839974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.839999 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.840018 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.943764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.943817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.943826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.943843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:47 crc kubenswrapper[4810]: I1003 06:56:47.943856 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:47Z","lastTransitionTime":"2025-10-03T06:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.047070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.047121 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.047134 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.047156 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.047169 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.150854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.150960 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.150991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.151019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.151046 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.254418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.254495 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.254518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.254553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.254579 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.357792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.357877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.357944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.357988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.358014 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.460571 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.460649 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.460662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.460682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.460694 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.563477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.563541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.563559 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.563586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.563603 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.667248 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.667323 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.667350 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.667376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.667395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.770341 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.770415 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.770433 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.770456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.770473 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.874073 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.875111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.875156 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.875187 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.875207 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.978846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.978970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.978998 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.979028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:48 crc kubenswrapper[4810]: I1003 06:56:48.979051 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:48Z","lastTransitionTime":"2025-10-03T06:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.083121 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.083198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.083224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.083250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.083269 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.186539 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.186634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.186652 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.186681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.186705 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.290678 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.290728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.290737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.290754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.290765 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.302325 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.302375 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.302402 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.302595 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.302734 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.302849 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.302994 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.303142 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.393727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.393779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.393790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.393806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.393819 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.416301 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.416532 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:49 crc kubenswrapper[4810]: E1003 06:56:49.416666 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:56:57.41663737 +0000 UTC m=+50.843888135 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.496435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.496508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.496521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.496547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.496561 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.600390 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.600465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.600480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.600501 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.600516 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
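The MountVolume.SetUp failure above also shows the kubelet's retry policy: after the secret lookup fails ("metrics-daemon-secret" not registered), the volume operation is re-queued with an exponentially growing delay, here 8s ("durationBeforeRetry 8s", no retries permitted until 06:56:57). The sketch below illustrates that style of backoff; the initial delay, growth factor, and cap are illustrative assumptions, not the kubelet's exact constants.

# Sketch of exponential backoff for a repeatedly failing volume operation.
# The initial delay, growth factor, and cap are illustrative assumptions.
from datetime import datetime, timedelta

INITIAL_DELAY = timedelta(milliseconds=500)
FACTOR = 2
MAX_DELAY = timedelta(minutes=2)

def next_retry(last_delay, now):
    """Return (new_delay, earliest_retry_time) after another failure."""
    new_delay = INITIAL_DELAY if last_delay is None else min(last_delay * FACTOR, MAX_DELAY)
    return new_delay, now + new_delay

delay = None
now = datetime(2025, 10, 3, 6, 56, 49)
for attempt in range(1, 6):
    delay, retry_at = next_retry(delay, now)
    print(f"attempt {attempt}: wait {delay.total_seconds()}s, retry at {retry_at}")

# With these assumed constants the fifth consecutive failure yields an 8s delay,
# which lines up with the "durationBeforeRetry 8s" seen in the log entry above.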
Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.703462 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.703543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.703562 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.703591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.703610 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.806828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.806960 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.806973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.806989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.807001 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.909659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.909740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.909764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.909796 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:49 crc kubenswrapper[4810]: I1003 06:56:49.909821 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:49Z","lastTransitionTime":"2025-10-03T06:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.013207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.013271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.013291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.013316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.013333 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.116967 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.117096 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.117116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.117145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.117163 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.221888 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.221972 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.221981 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.222006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.222017 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.325921 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.325995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.326008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.326034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.326049 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.429098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.429148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.429160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.429183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.429195 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.532336 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.532386 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.532398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.532419 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.532434 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.635602 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.635709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.635731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.635763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.635783 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.738689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.738809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.738873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.738932 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.738959 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.842804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.842880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.842926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.842962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.842985 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.946384 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.946449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.946466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.946489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:50 crc kubenswrapper[4810]: I1003 06:56:50.946511 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:50Z","lastTransitionTime":"2025-10-03T06:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.050523 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.050584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.050643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.050673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.050729 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.154067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.154154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.154173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.154201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.154222 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.258591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.258674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.258693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.258724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.258778 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.302538 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.302653 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.302733 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:51 crc kubenswrapper[4810]: E1003 06:56:51.302970 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.303051 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:51 crc kubenswrapper[4810]: E1003 06:56:51.303213 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:51 crc kubenswrapper[4810]: E1003 06:56:51.303324 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:51 crc kubenswrapper[4810]: E1003 06:56:51.303406 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.361394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.361443 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.361454 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.361472 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.361487 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.464674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.464720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.464730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.464750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.464762 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.567681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.567730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.567742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.567763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.567776 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.671344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.671382 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.671411 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.671429 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.671438 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.775640 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.775734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.775757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.775790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.775821 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.879201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.879252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.879269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.879294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.879310 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.982867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.982995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.983015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.983040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:51 crc kubenswrapper[4810]: I1003 06:56:51.983061 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:51Z","lastTransitionTime":"2025-10-03T06:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.086094 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.086173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.086193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.086222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.086245 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.196493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.197304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.197342 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.197379 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.197403 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.301689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.301782 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.301802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.301832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.301859 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.404733 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.404809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.404832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.404861 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.404881 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.508054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.508114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.508127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.508145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.508158 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.611332 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.611401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.611419 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.611449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.611469 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.714779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.714849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.714866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.714920 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.714939 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.818467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.818541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.818551 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.818569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.818579 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.922258 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.922327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.922345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.922370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:52 crc kubenswrapper[4810]: I1003 06:56:52.922389 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:52Z","lastTransitionTime":"2025-10-03T06:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.025596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.025643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.025661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.025686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.025698 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.121088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.121144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.121155 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.121175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.121189 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.135531 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:53Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.140701 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.140744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.140757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.140774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.140788 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.157829 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:53Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.162679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.162745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.162763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.162785 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.162803 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.179193 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:53Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.183951 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.184012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.184022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.184058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.184070 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.195825 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:53Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.200585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.200630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.200642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.200670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.200684 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.215307 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:53Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.215448 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.216980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.217031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.217046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.217068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.217081 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.302181 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.302386 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.302877 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.302999 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.303081 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.303155 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.303221 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:53 crc kubenswrapper[4810]: E1003 06:56:53.303306 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.319602 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.319632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.319640 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.319654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.319664 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.422665 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.422726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.422739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.422759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.422774 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.526043 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.526124 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.526147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.526201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.526223 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.629481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.629561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.629580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.629611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.629633 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.732558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.732629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.732641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.732664 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.732756 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.836169 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.836220 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.836233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.836249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.836261 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.938914 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.939233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.939359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.939484 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:53 crc kubenswrapper[4810]: I1003 06:56:53.939668 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:53Z","lastTransitionTime":"2025-10-03T06:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.043400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.044325 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.044501 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.044644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.044787 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.147683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.147735 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.147749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.147797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.147820 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.251382 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.251658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.251803 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.251924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.252019 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.302529 4810 scope.go:117] "RemoveContainer" containerID="c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.355621 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.356022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.356124 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.356283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.356395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.460591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.460661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.460682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.460725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.460745 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.563821 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.563880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.563910 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.563931 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.563948 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.667066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.667697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.667723 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.667765 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.667788 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.703406 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/1.log" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.712359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.713015 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.742471 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.772171 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.772931 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.773046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.773068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.773097 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.773130 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.797739 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.822806 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.848641 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.870275 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.876355 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.876428 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.876450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.876483 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.876500 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.891344 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.909166 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.924040 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.946095 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.953409 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.962390 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 
06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.962437 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.979143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.979210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.979225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.979250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.979269 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:54Z","lastTransitionTime":"2025-10-03T06:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:54 crc kubenswrapper[4810]: I1003 06:56:54.993115 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987
117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:54Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.016583 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.030104 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.041959 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.058119 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.076057 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.081325 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.081399 4810 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.081413 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.081431 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.081470 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.089117 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.100230 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.115916 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.129453 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.152788 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.166003 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.180540 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.185415 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.185442 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.185450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.185464 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.185475 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.195206 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.207803 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.226186 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.242587 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.257835 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.283969 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.287735 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.287878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.287990 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.288083 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.288186 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.301910 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.301963 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.301823 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.301916 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:55 crc kubenswrapper[4810]: E1003 06:56:55.302040 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:55 crc kubenswrapper[4810]: E1003 06:56:55.302264 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:55 crc kubenswrapper[4810]: E1003 06:56:55.302344 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.302370 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:55 crc kubenswrapper[4810]: E1003 06:56:55.302556 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.326570 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.338763 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.351211 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.361369 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.391452 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.391514 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.391524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.391546 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.391559 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.494350 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.494407 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.494417 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.494436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.494452 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.598021 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.598112 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.598136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.598169 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.598195 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.701380 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.701505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.701530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.701562 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.701586 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.718320 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/2.log" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.719291 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/1.log" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.722482 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" exitCode=1 Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.722865 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.722987 4810 scope.go:117] "RemoveContainer" containerID="c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.723819 4810 scope.go:117] "RemoveContainer" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" Oct 03 06:56:55 crc kubenswrapper[4810]: E1003 06:56:55.724094 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.746179 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.769082 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.799284 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f
7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.804939 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.804970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.804982 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.805000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.805013 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.822936 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.844520 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.866442 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.883170 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc 
kubenswrapper[4810]: I1003 06:56:55.908578 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.908648 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.908666 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.908690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.908708 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:55Z","lastTransitionTime":"2025-10-03T06:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.920456 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.941468 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.960832 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:55 crc kubenswrapper[4810]: I1003 06:56:55.991949 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c21c291073c3a6ddce81ed813a615f4987e5d1e48ca4fec6fd9245d48a74a68b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"message\\\":\\\", AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.34\\\\\\\", Port:8888, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 06:56:39.880039 6240 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI1003 06:56:39.880059 6240 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880074 6240 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-whnpx\\\\nI1003 06:56:39.880085 6240 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-whnpx in node crc\\\\nF1003 06:56:39.880091 6240 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: fai\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:55Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.010271 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 
06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.011808 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.011854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.011873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.011925 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.011944 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.030803 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.056185 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.075112 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.094304 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.114773 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.114855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.114945 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.114964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.114988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.115009 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.129127 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.218612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.218688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.218706 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.218734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.218752 4810 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.322616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.322692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.322716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.322745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.322769 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.425807 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.425934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.425962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.425993 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.426016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.529220 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.529274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.529284 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.529300 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.529311 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.632846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.632933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.632955 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.632980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.633000 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.729112 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/2.log" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.732501 4810 scope.go:117] "RemoveContainer" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" Oct 03 06:56:56 crc kubenswrapper[4810]: E1003 06:56:56.732736 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.734711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.734756 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.734767 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.734787 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.734799 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.751846 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.770458 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.786948 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.804719 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.825122 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.837503 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.837591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.837614 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.837649 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.837672 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.847744 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.870005 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 
2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.893116 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/
etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025
-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.911258 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.931427 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.940603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.940645 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.940658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.940680 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.940695 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:56Z","lastTransitionTime":"2025-10-03T06:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.946862 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.960105 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.976568 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:56 crc kubenswrapper[4810]: I1003 06:56:56.992355 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:56Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc 
kubenswrapper[4810]: I1003 06:56:57.010089 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.030745 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.044199 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.044252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.044270 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.044296 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.044311 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.061005 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.075394 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.147301 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.147377 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.147394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.147471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.147489 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.251441 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.251536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.251554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.251611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.251629 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.302400 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.302443 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.302529 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.302718 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.302760 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.302962 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.303176 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.303404 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.334596 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced
593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.351297 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.354766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.354818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.354834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.354860 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.354878 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.366987 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.381649 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.399532 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.421147 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.439258 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc 
kubenswrapper[4810]: I1003 06:56:57.454955 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.457270 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.457299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.457308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.457322 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.457333 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.471761 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.504834 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.510608 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.510827 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:57 crc kubenswrapper[4810]: E1003 06:56:57.510943 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:57:13.510916573 +0000 UTC m=+66.938167318 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.522564 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.538984 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-
cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.553566 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.559331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.559370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.559378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.559392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.559407 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.567030 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.579971 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.596188 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.609631 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.629393 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f
7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:56:57Z is after 2025-08-24T17:21:41Z" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.662875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.662979 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.662994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.663019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.663038 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.765948 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.766008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.766024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.766047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.766071 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.869883 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.869957 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.869971 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.869993 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.870013 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.974502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.974580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.974598 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.974625 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:57 crc kubenswrapper[4810]: I1003 06:56:57.974648 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:57Z","lastTransitionTime":"2025-10-03T06:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.077620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.077704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.077725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.077752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.077812 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.181223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.181316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.181343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.181383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.181407 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.284992 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.285040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.285053 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.285073 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.285085 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.389192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.389238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.389250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.389269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.389281 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.492234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.492272 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.492285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.492309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.492322 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.595746 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.595824 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.595843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.595861 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.595872 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.698935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.698996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.699022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.699049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.699063 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.802412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.802486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.802507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.802535 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.802556 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.906313 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.906481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.906515 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.906585 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:58 crc kubenswrapper[4810]: I1003 06:56:58.906624 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:58Z","lastTransitionTime":"2025-10-03T06:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.010836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.010924 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.010938 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.010957 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.010974 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.113784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.113852 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.113865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.113907 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.113924 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.217580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.217647 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.217663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.217697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.217717 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.231348 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.231535 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.231590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231642 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:57:31.231604541 +0000 UTC m=+84.658855266 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.231706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231774 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.231785 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231787 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 
06:56:59.231950 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231969 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231983 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231853 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:57:31.231826107 +0000 UTC m=+84.659077022 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.231851 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.232105 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.232123 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.232051 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:57:31.232040583 +0000 UTC m=+84.659291508 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.232219 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-03 06:57:31.232194017 +0000 UTC m=+84.659444742 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.232234 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:57:31.232228168 +0000 UTC m=+84.659478903 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.302493 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.302557 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.302624 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.302656 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.302748 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.302853 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.302961 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:56:59 crc kubenswrapper[4810]: E1003 06:56:59.303018 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.320456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.320511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.320527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.320545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.320559 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.423778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.423877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.423937 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.423966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.423985 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.527750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.527813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.527825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.527847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.527862 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.631489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.631581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.631607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.631641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.631667 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.734987 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.735097 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.735112 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.735135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.735150 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.838546 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.838618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.838634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.838658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.838675 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.941524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.941580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.941594 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.941613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:56:59 crc kubenswrapper[4810]: I1003 06:56:59.941626 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:56:59Z","lastTransitionTime":"2025-10-03T06:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.043662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.043695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.043703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.043716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.043726 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.146789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.146859 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.146879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.146947 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.146971 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.250154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.250237 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.250261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.250292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.250316 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.352973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.353012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.353022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.353038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.353051 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.456654 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.456706 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.456718 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.456738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.456749 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.561000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.561090 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.561114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.561144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.561172 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.663601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.663666 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.663684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.663710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.663734 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.767263 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.767378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.767397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.767419 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.767438 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.870522 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.870577 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.870594 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.870616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.870633 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.974259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.974309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.974323 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.974343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:00 crc kubenswrapper[4810]: I1003 06:57:00.974360 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:00Z","lastTransitionTime":"2025-10-03T06:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.077081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.077139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.077160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.077182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.077199 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.180001 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.180038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.180047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.180061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.180071 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.282884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.282982 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.282999 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.283027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.283045 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.302024 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.302074 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.302024 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.302157 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:01 crc kubenswrapper[4810]: E1003 06:57:01.302226 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:01 crc kubenswrapper[4810]: E1003 06:57:01.302353 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:01 crc kubenswrapper[4810]: E1003 06:57:01.302464 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:01 crc kubenswrapper[4810]: E1003 06:57:01.302590 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.386702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.386761 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.386771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.386790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.386801 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.490053 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.490111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.490124 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.490143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.490154 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.594259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.594391 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.594412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.594444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.594463 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.698561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.698627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.698644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.698676 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.698694 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.802994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.803138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.803163 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.803196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.803218 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.905869 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.905974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.906002 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.906034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:01 crc kubenswrapper[4810]: I1003 06:57:01.906060 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:01Z","lastTransitionTime":"2025-10-03T06:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.008940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.009017 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.009041 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.009071 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.009094 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.112339 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.112387 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.112400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.112417 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.112429 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.215609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.215688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.215711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.215742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.215766 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.318463 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.318516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.318534 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.318556 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.318576 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.421153 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.421302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.421327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.421406 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.421431 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.525479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.525524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.525538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.525555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.525564 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.628456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.628516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.628535 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.628561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.628580 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.731571 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.731642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.731659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.731685 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.731706 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.834150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.834215 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.834230 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.834246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.834261 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.937857 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.937919 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.937930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.937944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:02 crc kubenswrapper[4810]: I1003 06:57:02.937955 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:02Z","lastTransitionTime":"2025-10-03T06:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.040728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.040804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.040823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.040889 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.040995 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
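The kubelet keeps reporting Ready=False with reason KubeletNotReady because nothing has populated /etc/kubernetes/cni/net.d/ yet. A minimal triage sketch in Python, meant to be run on the node itself; the extension filter (.conf, .conflist, .json) reflects the standard CNI config loader and the helper name list_cni_configs is only illustrative:

#!/usr/bin/env python3
# List candidate CNI configuration files in the directory the kubelet
# complains about; an empty result matches the NetworkPluginNotReady message.
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # path taken from the log message above

def list_cni_configs(conf_dir: str = CNI_CONF_DIR) -> list[str]:
    try:
        entries = sorted(os.listdir(conf_dir))
    except FileNotFoundError:
        return []
    # Standard CNI config loaders pick up .conf, .conflist and .json files.
    return [name for name in entries if name.endswith((".conf", ".conflist", ".json"))]

if __name__ == "__main__":
    configs = list_cni_configs()
    if configs:
        print("CNI configuration present:", ", ".join(configs))
    else:
        print("no CNI configuration in", CNI_CONF_DIR,
              "- the network provider has not written its config yet")

An empty listing is consistent with the condition staying KubeletNotReady until the network provider starts and writes its configuration.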
Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.144698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.144774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.144792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.144816 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.144836 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.248181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.248242 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.248264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.248293 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.248315 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.302170 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.302209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.302335 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.302184 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.302377 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.302446 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.302513 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.302595 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.350828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.350884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.350926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.350950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.350967 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.454263 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.454320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.454332 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.454351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.454366 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.530557 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.530632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.530657 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.530693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.530723 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
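Because the same Ready condition is emitted on every status sync, the duration of the outage can be read straight off these "Node became not ready" entries. A small sketch, assuming journal output is fed one entry per line on stdin; the regular expression and the ready_condition helper are illustrative only:

#!/usr/bin/env python3
# Extract the Ready condition from "Node became not ready" journal lines and
# print its heartbeat time, reason and message for triage.
import json
import re
import sys

# The condition payload in these entries is a flat JSON object, so matching up
# to the first closing brace is sufficient.
CONDITION_RE = re.compile(r'condition=(\{[^}]*\})')

def ready_condition(line: str):
    match = CONDITION_RE.search(line)
    return json.loads(match.group(1)) if match else None

if __name__ == "__main__":
    for line in sys.stdin:
        cond = ready_condition(line)
        if cond and cond.get("type") == "Ready":
            print(cond["lastHeartbeatTime"], cond["reason"], "-", cond["message"])

Piping the kubelet journal through this prints one line per status sync, which makes the roughly 100 ms cadence of the entries above easy to see.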
Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.553862 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:03Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.559757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.559817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.559826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.559845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.559856 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.589467 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:03Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.595790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.595841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.595853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.595876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.595907 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.616536 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:03Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.621864 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.621928 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.621943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.621964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.621999 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.643601 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:03Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.649677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.649733 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
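
Each rejected patch above carries the same four conditions the kubelet keeps recording events for (MemoryPressure, DiskPressure, PIDPressure, Ready), keyed by $setElementOrder/conditions, together with the node's allocatable and capacity figures and the cached image list. Once the journald escaping is stripped, the Ready condition is ordinary JSON; the following is a minimal Go sketch that decodes it, using a literal copied from the "Node became not ready" entries above. The struct here is illustrative only, not the kubelet's own type.

    // condition.go - decode the Ready condition exactly as it appears
    // (unescaped) in the status patches above. Illustrative sketch only.
    package main

    import (
        "encoding/json"
        "fmt"
    )

    // NodeCondition mirrors only the fields visible in the log payload.
    type NodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        // Copied verbatim from the setters.go "Node became not ready" entries.
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
        var c NodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            panic(err)
        }
        // The kubelet keeps re-recording this condition while it retries the
        // status patch that the webhook rejects.
        fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
    }

The same decode applies to the MemoryPressure, DiskPressure and PIDPressure entries in the patch; only the reason and message fields differ.
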
event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.649783 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.649806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.649822 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.672113 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:03Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:03 crc kubenswrapper[4810]: E1003 06:57:03.672317 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.675110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
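
The status patch is rejected three times for the same reason, after which the kubelet gives up ("update node status exceeds retry count"): the node.network-node-identity.openshift.io webhook behind https://127.0.0.1:9743 serves a certificate whose notAfter of 2025-08-24T17:21:41Z is earlier than the node's clock of 2025-10-03T06:57:03Z. Below is a minimal Go sketch of the same validity check, run on the node against the address taken from the log; it only reads the served leaf certificate and is not the kubelet's or the webhook's code.

    // certcheck.go - inspect the certificate served on the webhook endpoint
    // named in the log above and compare its validity window to the clock.
    // The address is taken from the log; the program itself is illustrative.
    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Skip chain verification on purpose: we only want to read the leaf
        // certificate's validity window, the field the x509 error cites.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        leaf := conn.ConnectionState().PeerCertificates[0]
        now := time.Now().UTC()
        fmt.Printf("subject:   %s\nnotBefore: %s\nnotAfter:  %s\nnow:       %s\n",
            leaf.Subject, leaf.NotBefore.UTC(), leaf.NotAfter.UTC(), now)
        if now.After(leaf.NotAfter) || now.Before(leaf.NotBefore) {
            // The same condition the handshake reports as
            // "certificate has expired or is not yet valid".
            fmt.Println("certificate is outside its validity window")
        }
    }

A notAfter printed in the past confirms the handshake failure is on the certificate side rather than a skewed node clock, so the webhook's serving certificate needs to be rotated before the node status patch can succeed.
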
event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.675166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.675185 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.675213 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.675240 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.778801 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.778886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.778938 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.778964 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.778982 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.881943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.882108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.882152 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.882191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.882217 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.986291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.986345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.986358 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.986381 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:03 crc kubenswrapper[4810]: I1003 06:57:03.986395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:03Z","lastTransitionTime":"2025-10-03T06:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.089603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.089675 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.089692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.089727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.089750 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.194304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.194376 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.194399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.194430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.194454 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.298114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.298168 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.298189 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.298214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.298232 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.401461 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.401520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.401538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.401572 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.401609 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.504884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.505027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.505048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.505074 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.505096 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.608373 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.608451 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.608473 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.608506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.608600 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.712103 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.712205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.712236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.712271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.712299 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.815601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.815755 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.815774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.815793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.815807 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.919520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.919577 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.919588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.919612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:04 crc kubenswrapper[4810]: I1003 06:57:04.919625 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:04Z","lastTransitionTime":"2025-10-03T06:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.023397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.023461 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.023479 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.023506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.023525 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.128038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.128083 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.128095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.128117 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.128129 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.231689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.231770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.231788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.232223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.232282 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.301583 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.301609 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.301776 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:05 crc kubenswrapper[4810]: E1003 06:57:05.302191 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:05 crc kubenswrapper[4810]: E1003 06:57:05.302351 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:05 crc kubenswrapper[4810]: E1003 06:57:05.302540 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.302673 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:05 crc kubenswrapper[4810]: E1003 06:57:05.302829 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.336069 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.336132 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.336143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.336164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.336179 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.439812 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.440132 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.440151 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.440176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.440194 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
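
Meanwhile every heartbeat and every skipped pod sync above cites the same underlying condition: the container runtime reports NetworkReady=false because no CNI configuration exists in /etc/kubernetes/cni/net.d/. The following is a small Go sketch of the equivalent directory check, runnable on the node; the path comes from the log, while the accepted extensions (.conf, .conflist, .json) follow the usual CNI convention and are an assumption here, not something stated in the log.

    // cnicheck.go - look for CNI network configuration files in the directory
    // the kubelet keeps naming in its NotReady messages. Illustrative sketch.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Println("cannot read", confDir, ":", err)
            return
        }
        var confs []string
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // typical CNI config extensions (assumption)
                confs = append(confs, e.Name())
            }
        }
        if len(confs) == 0 {
            // Matches the NetworkPluginNotReady condition reported above.
            fmt.Println("no CNI configuration files found in", confDir)
            return
        }
        fmt.Println("found CNI configuration files:", confs)
    }

An empty result matches the NetworkPluginNotReady message; once the network provider writes a configuration file into that directory, the Ready condition should flip on a subsequent sync and the pending sandboxes for the pods listed above can be created.
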
Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.543511 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.543611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.543634 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.543663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.543687 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.647379 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.647472 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.647495 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.647528 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.647611 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.751219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.751300 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.751320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.751350 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.751368 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.854607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.854658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.854670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.854724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.854739 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.958619 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.958696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.958738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.958772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:05 crc kubenswrapper[4810]: I1003 06:57:05.958797 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:05Z","lastTransitionTime":"2025-10-03T06:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.061306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.061408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.061426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.061450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.061470 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.164710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.164744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.164753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.164766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.164776 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.267759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.267838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.267862 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.267935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.267964 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.370419 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.370505 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.370586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.370623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.370644 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.473433 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.473503 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.473524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.473558 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.473582 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.577709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.577770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.577785 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.577811 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.577829 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.680703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.680784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.680805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.680836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.680875 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.783550 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.783650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.783669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.783694 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.783711 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.887513 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.887576 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.887591 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.887612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.887629 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.990691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.990774 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.990793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.990818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:06 crc kubenswrapper[4810]: I1003 06:57:06.990837 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:06Z","lastTransitionTime":"2025-10-03T06:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.094394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.094436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.094448 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.094464 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.094476 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.197678 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.197729 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.197742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.197759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.197779 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.301802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.301994 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.302025 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.302316 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.302111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.302783 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.302881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: E1003 06:57:07.303285 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.303568 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: E1003 06:57:07.303680 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.303621 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: E1003 06:57:07.303948 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:07 crc kubenswrapper[4810]: E1003 06:57:07.303570 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.327028 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.349878 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.369593 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.388711 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.406463 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc 
kubenswrapper[4810]: I1003 06:57:07.408703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.408854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.408887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.408953 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.408979 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.443505 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.462883 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.480690 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.511490 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.512070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.512392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.512412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.512431 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.512447 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.528621 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.545620 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.557168 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.571433 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.588718 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.601446 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.616564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.616629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.616643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.616662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.616723 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.628683 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.646453 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.664294 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b
8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:07Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.720345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.720579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.720642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.720732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.720847 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.825194 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.825575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.825744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.825888 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.826087 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.930003 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.930063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.930075 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.930094 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:07 crc kubenswrapper[4810]: I1003 06:57:07.930113 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:07Z","lastTransitionTime":"2025-10-03T06:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.032060 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.032106 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.032116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.032131 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.032141 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.134807 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.134845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.134856 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.134874 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.134917 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.237789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.237872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.237966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.238005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.238034 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.340616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.340663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.340673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.340685 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.340695 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.444346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.445191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.445283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.445317 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.445709 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.549514 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.549611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.550087 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.550175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.550519 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.654420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.654495 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.654521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.654553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.654579 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.758054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.758306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.758387 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.758548 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.758704 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.862155 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.862203 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.862216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.862234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.862246 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.966375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.966446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.966474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.966506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:08 crc kubenswrapper[4810]: I1003 06:57:08.966532 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:08Z","lastTransitionTime":"2025-10-03T06:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.071177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.071215 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.071226 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.071241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.071252 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.174742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.174832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.174870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.174942 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.174966 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.278211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.278254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.278265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.278281 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.278293 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.302100 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.302199 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:09 crc kubenswrapper[4810]: E1003 06:57:09.302274 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.302100 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:09 crc kubenswrapper[4810]: E1003 06:57:09.302322 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.302199 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:09 crc kubenswrapper[4810]: E1003 06:57:09.302383 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:09 crc kubenswrapper[4810]: E1003 06:57:09.302476 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.380569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.380630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.380645 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.380668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.380683 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.489407 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.489493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.489516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.489534 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.489548 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.592805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.592854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.592867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.592884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.592920 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.696128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.696180 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.696202 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.696229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.696249 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.799563 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.799609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.799625 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.799692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.799711 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.902518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.902563 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.902580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.902603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:09 crc kubenswrapper[4810]: I1003 06:57:09.902621 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:09Z","lastTransitionTime":"2025-10-03T06:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.005526 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.005557 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.005567 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.005579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.005589 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.109063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.109101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.109115 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.109132 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.109145 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.212070 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.212166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.212178 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.212198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.212209 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.315231 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.315266 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.315285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.315300 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.315314 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.418261 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.418310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.418324 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.418339 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.418352 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.521745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.521807 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.521819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.521849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.521891 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.625556 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.625605 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.625622 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.625643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.625660 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.732657 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.732715 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.732751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.732780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.732803 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.835475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.835554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.835588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.835618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.835639 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.938667 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.938747 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.938777 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.938805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:10 crc kubenswrapper[4810]: I1003 06:57:10.938825 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:10Z","lastTransitionTime":"2025-10-03T06:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.041354 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.041432 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.041453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.041476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.041493 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.145193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.145247 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.145256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.145278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.145293 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.248418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.248474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.248487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.248507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.248525 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.301875 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.302011 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:11 crc kubenswrapper[4810]: E1003 06:57:11.302074 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:11 crc kubenswrapper[4810]: E1003 06:57:11.302187 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.301924 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:11 crc kubenswrapper[4810]: E1003 06:57:11.302332 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.301915 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:11 crc kubenswrapper[4810]: E1003 06:57:11.302428 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.350935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.350979 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.350989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.351005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.351015 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.454288 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.454346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.454360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.454385 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.454403 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.558062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.558122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.558139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.558160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.558173 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.660960 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.661004 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.661016 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.661033 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.661045 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.764039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.764101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.764113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.764130 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.764145 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.866867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.866948 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.866965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.866985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.867000 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.969530 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.969596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.969611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.969630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:11 crc kubenswrapper[4810]: I1003 06:57:11.969644 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:11Z","lastTransitionTime":"2025-10-03T06:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.072985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.073036 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.073046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.073061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.073073 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.175218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.175249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.175258 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.175270 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.175280 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.279031 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.279078 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.279088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.279102 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.279114 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.303426 4810 scope.go:117] "RemoveContainer" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" Oct 03 06:57:12 crc kubenswrapper[4810]: E1003 06:57:12.303806 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.382567 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.382603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.382611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.382659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.382670 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.484603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.484665 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.484678 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.484696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.484708 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.587425 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.587465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.587475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.587491 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.587503 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.689960 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.690034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.690060 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.690096 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.690120 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.793398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.793453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.793468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.793486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.793499 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.896120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.896160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.896171 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.896184 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.896193 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.998211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.998255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.998264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.998299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:12 crc kubenswrapper[4810]: I1003 06:57:12.998312 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:12Z","lastTransitionTime":"2025-10-03T06:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.101079 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.101135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.101149 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.101164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.101176 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.204173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.204214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.204224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.204240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.204250 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.301820 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.301875 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.301928 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.301999 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.302033 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.302201 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.302310 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.302397 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.306067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.306109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.306120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.306137 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.306148 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.409128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.409238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.409738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.409834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.410200 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.513308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.513371 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.513385 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.513401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.513412 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.593939 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.594115 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:57:13 crc kubenswrapper[4810]: E1003 06:57:13.594211 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:57:45.59418506 +0000 UTC m=+99.021435825 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.617601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.617664 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.617684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.617710 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.617727 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.721258 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.721299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.721315 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.721339 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.721359 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.825008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.825176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.825197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.825228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.825251 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.928299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.928337 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.928348 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.928364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:13 crc kubenswrapper[4810]: I1003 06:57:13.928376 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:13Z","lastTransitionTime":"2025-10-03T06:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.031234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.031289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.031304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.031324 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.031339 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.062276 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.062330 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.062344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.062364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.062378 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.079129 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:14Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.085789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.085838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.085853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.085871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.085883 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.099225 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:14Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.103369 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.103507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.103575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.103644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.103714 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.115972 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:14Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.120724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.120767 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.120776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.120790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.120800 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.133335 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:14Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.136550 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.136614 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.136628 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.136646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.136684 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.147677 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:14Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:14 crc kubenswrapper[4810]: E1003 06:57:14.148191 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.149689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
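Every "Error updating node status, will retry" entry above, and the final "Unable to update node status ... exceeds retry count", ends in the same TLS failure: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, well before the current time 2025-10-03T06:57:14Z. A minimal Go sketch for confirming that from the node (assumptions: this program is an ad-hoc diagnostic, not part of the log or of any OpenShift tooling; 127.0.0.1:9743 is reachable where it runs; chain verification is skipped deliberately so the handshake survives the expired certificate):

// certcheck.go - sketch: dial the webhook endpoint the kubelet is failing
// against and print the validity window of the certificate it serves.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// Address taken verbatim from the webhook URL in the error messages above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Skip chain verification: the point is to inspect the expired
		// certificate, not to trust it.
		InsecureSkipVerify: true,
	})
	if err != nil {
		log.Fatalf("dial failed: %v", err)
	}
	defer conn.Close()

	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", leaf.Subject)
	fmt.Println("notBefore:", leaf.NotBefore)
	fmt.Println("notAfter: ", leaf.NotAfter) // per the log, this should read 2025-08-24T17:21:41Z
}

Once the webhook certificate has been rotated, the status patches should start succeeding on the kubelet's normal retry cycle without any restart.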
event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.149809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.149912 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.150015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.150104 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.252878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.252941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.252950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.252963 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.252975 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.356309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.356797 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.357073 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.357334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.357526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.461955 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.462383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.462549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.462702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.462933 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.567227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.567294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.567308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.567326 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.567339 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.671136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.671226 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.671251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.671283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.671310 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.774028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.774116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.774133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.774157 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.774180 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.877501 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.877581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.877621 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.877639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.877650 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.980780 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.980829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.980843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.980866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:14 crc kubenswrapper[4810]: I1003 06:57:14.980945 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:14Z","lastTransitionTime":"2025-10-03T06:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.083837 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.083919 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.083930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.083950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.083963 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.187994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.188066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.188084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.188109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.188131 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.292029 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.292096 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.292114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.292138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.292157 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.301682 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.301714 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.301719 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.301718 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:15 crc kubenswrapper[4810]: E1003 06:57:15.301929 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:15 crc kubenswrapper[4810]: E1003 06:57:15.301963 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:15 crc kubenswrapper[4810]: E1003 06:57:15.302033 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:15 crc kubenswrapper[4810]: E1003 06:57:15.302148 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.404015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.404053 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.404061 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.404078 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.404090 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.507196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.507238 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.507246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.507259 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.507269 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.610162 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.610252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.610314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.610342 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.610962 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.714207 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.715320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.715462 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.715627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.715753 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.800363 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/0.log" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.800851 4810 generic.go:334] "Generic (PLEG): container finished" podID="f9421086-70f1-441e-9aa0-5ac57a048c89" containerID="22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06" exitCode=1 Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.801095 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerDied","Data":"22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.801820 4810 scope.go:117] "RemoveContainer" containerID="22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.818040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.818143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.818157 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.818173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.818185 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
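The PLEG event above records the kube-multus container 22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06 finishing with exit code 1, and the kubelet has just parsed its log at /var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/0.log. A sketch, not cluster tooling, for printing the tail of that same file, which is usually where the reason for the non-zero exit shows up (the path comes straight from the log entry; the tail length is an arbitrary choice):

// multustail.go - sketch: print the last lines of the kube-multus container log.
package main

import (
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	const path = "/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/0.log"
	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatalf("cannot read %s: %v", path, err)
	}
	lines := strings.Split(strings.TrimRight(string(data), "\n"), "\n")
	tail := 20 // arbitrary: enough context to see why the container exited with code 1
	if len(lines) < tail {
		tail = len(lines)
	}
	for _, l := range lines[len(lines)-tail:] {
		fmt.Println(l)
	}
}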
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.820820 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.836696 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.861123 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.875570 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.905261 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.920049 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.920092 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.920105 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.920125 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.920139 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:15Z","lastTransitionTime":"2025-10-03T06:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.926965 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.943380 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.962264 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.977616 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:15 crc kubenswrapper[4810]: I1003 06:57:15.995794 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon 
started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:15Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.009856 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.022738 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.022777 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.022788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.022802 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.022814 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.025264 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.053281 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.071631 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.089387 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.101975 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.113339 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.122947 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.128311 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.128380 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.128399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.128429 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.128453 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.231360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.231426 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.231444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.231468 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.231486 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.335145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.335229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.335240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.335256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.335269 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.438784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.438869 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.438887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.438934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.438948 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.542216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.542271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.542286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.542307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.542323 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.645504 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.645556 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.645565 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.645580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.645591 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.749672 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.749745 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.749767 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.749798 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.749823 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.808852 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/0.log" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.808962 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerStarted","Data":"899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.832607 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.852999 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.853220 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.853273 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.853291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.853317 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.853339 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.868796 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.880811 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.895072 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.912296 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.938862 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f
7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.956960 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.957229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.957331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.957437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.957526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:16Z","lastTransitionTime":"2025-10-03T06:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:16 crc kubenswrapper[4810]: I1003 06:57:16.987154 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:16Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.003237 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.021740 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.041742 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.060709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.060761 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.060772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.060793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.060807 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.066825 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.086099 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.104061 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.123829 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.147731 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.163986 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.164041 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.164055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.164076 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.164094 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.170182 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.186758 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.266107 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.266172 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.266195 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.266223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.266248 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.301782 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.301796 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.301886 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:17 crc kubenswrapper[4810]: E1003 06:57:17.302013 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.302057 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:17 crc kubenswrapper[4810]: E1003 06:57:17.302141 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:17 crc kubenswrapper[4810]: E1003 06:57:17.302311 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:17 crc kubenswrapper[4810]: E1003 06:57:17.302453 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.323467 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.338199 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.365064 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760
f50307542e566d397343e702\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.369252 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.369364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.369443 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.369487 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.369583 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.377558 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.390966 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.408311 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.426687 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.443874 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.459326 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.473030 4810 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.473059 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.473067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.473102 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.473114 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.479299 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.495340 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.512445 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.527363 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.543562 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.555843 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.568535 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.575357 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.575421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.575432 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.575450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.575461 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.579000 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.608861 4810 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:
56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:17Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.678265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.678329 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.678345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.678368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.678388 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.780854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.780931 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.780941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.780959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.780974 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.883401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.883470 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.883484 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.883553 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.883573 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.986156 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.986240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.986263 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.986298 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:17 crc kubenswrapper[4810]: I1003 06:57:17.986324 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:17Z","lastTransitionTime":"2025-10-03T06:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.089058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.089121 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.089130 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.089147 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.089157 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.192119 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.192383 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.192518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.192628 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.192718 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.301173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.301216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.301228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.301240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.301251 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.404938 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.405011 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.405035 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.405065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.405090 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.508264 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.508324 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.508336 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.508354 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.508366 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.611742 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.611819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.611845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.611872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.611891 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.714398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.714446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.714459 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.714475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.714486 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.816959 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.817015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.817034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.817063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.817085 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.920108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.920490 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.920559 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.920659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:18 crc kubenswrapper[4810]: I1003 06:57:18.920745 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:18Z","lastTransitionTime":"2025-10-03T06:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.023579 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.023632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.023646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.023669 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.023684 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.126578 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.126620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.126629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.126642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.126652 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.229784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.229858 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.229882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.229952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.229975 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.302002 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:19 crc kubenswrapper[4810]: E1003 06:57:19.302256 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.302324 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.302387 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:19 crc kubenswrapper[4810]: E1003 06:57:19.302483 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.302445 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:19 crc kubenswrapper[4810]: E1003 06:57:19.302646 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:19 crc kubenswrapper[4810]: E1003 06:57:19.302751 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.332824 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.333297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.333394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.333494 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.333674 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.436592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.436992 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.437146 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.437316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.437481 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.540072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.540114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.540125 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.540143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.540156 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.643361 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.643422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.643440 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.643469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.643487 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.746976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.747039 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.747059 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.747084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.747102 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.851012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.851090 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.851108 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.851133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.851153 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.954494 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.954566 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.954578 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.954593 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:19 crc kubenswrapper[4810]: I1003 06:57:19.954605 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:19Z","lastTransitionTime":"2025-10-03T06:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.057448 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.057520 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.057538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.057564 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.057584 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.160545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.160605 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.160623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.160647 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.160668 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.263631 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.263700 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.263723 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.263752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.263774 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.365671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.365711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.365722 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.365736 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.365745 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.467885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.467945 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.467958 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.467973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.467985 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.570613 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.570679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.570697 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.570720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.570737 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.673320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.673420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.673439 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.673504 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.673528 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.776241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.776314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.776327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.776344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.776355 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.879107 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.879198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.879223 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.879266 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.879291 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.982517 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.982611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.982662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.982689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:20 crc kubenswrapper[4810]: I1003 06:57:20.982707 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:20Z","lastTransitionTime":"2025-10-03T06:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.087260 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.087304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.087316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.087334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.087348 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.191158 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.191232 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.191253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.191283 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.191305 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.294702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.294762 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.294781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.294810 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.294830 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.306128 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.306271 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:21 crc kubenswrapper[4810]: E1003 06:57:21.306420 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.306734 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.306793 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:21 crc kubenswrapper[4810]: E1003 06:57:21.306960 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:21 crc kubenswrapper[4810]: E1003 06:57:21.307309 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:21 crc kubenswrapper[4810]: E1003 06:57:21.307619 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.398164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.398209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.398218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.398233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.398244 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.501872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.501958 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.501975 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.502005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.502029 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.605476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.606314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.606339 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.606369 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.606390 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.709279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.709347 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.709370 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.709396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.709416 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.812257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.812325 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.812349 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.812380 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.812406 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.915582 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.915631 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.915648 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.915665 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:21 crc kubenswrapper[4810]: I1003 06:57:21.915677 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:21Z","lastTransitionTime":"2025-10-03T06:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.018540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.018623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.018648 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.018679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.018700 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.121243 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.121322 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.121346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.121379 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.121403 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.225204 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.225274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.225291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.225316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.225335 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.328869 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.328992 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.329020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.329047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.329069 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.432222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.432286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.432303 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.432327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.432343 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.535855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.535942 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.535967 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.535997 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.536016 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.638456 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.638525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.638549 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.638580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.638601 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.741372 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.741431 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.741453 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.741475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.741496 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.844611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.844659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.844668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.844683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.844695 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.949094 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.949183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.949209 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.949233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:22 crc kubenswrapper[4810]: I1003 06:57:22.949253 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:22Z","lastTransitionTime":"2025-10-03T06:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.057312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.057404 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.057423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.057449 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.057467 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.160588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.160642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.160655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.160673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.160686 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.263772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.263958 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.263978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.264005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.264022 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.301540 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.301580 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.301568 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:23 crc kubenswrapper[4810]: E1003 06:57:23.301741 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:23 crc kubenswrapper[4810]: E1003 06:57:23.301915 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.302823 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:23 crc kubenswrapper[4810]: E1003 06:57:23.302974 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:23 crc kubenswrapper[4810]: E1003 06:57:23.302988 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.367993 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.368066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.368090 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.368120 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.368139 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.471315 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.471686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.471881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.472306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.472380 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.577030 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.577090 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.577114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.577145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.577170 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.680766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.680818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.680835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.680887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.680942 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.783632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.783707 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.783731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.783764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.783788 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.886996 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.887054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.887066 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.887084 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.887098 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.990559 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.990816 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.990838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.990871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:23 crc kubenswrapper[4810]: I1003 06:57:23.990928 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:23Z","lastTransitionTime":"2025-10-03T06:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.094874 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.095168 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.095190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.095216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.095236 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.199265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.199328 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.199349 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.199375 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.199396 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302313 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302337 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302355 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.302851 4810 scope.go:117] "RemoveContainer" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.339050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.339152 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.339171 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.339233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.339253 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.377811 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.385161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.385227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.385244 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.385274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.385292 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.407157 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.413804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.413851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.413861 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.413881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.413904 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.433876 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.441312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.441381 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.441403 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.441430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.441449 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.459661 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.465865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.465926 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.465936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.465956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.465966 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.480197 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: E1003 06:57:24.480466 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.482531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.482595 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.482616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.482644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.482664 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.586256 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.586314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.586327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.586347 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.586371 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.689345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.689394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.689405 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.689422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.689435 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.791720 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.791761 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.791770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.791785 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.791795 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.839514 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/2.log" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.841969 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.842326 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.856272 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.868783 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.881294 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.896734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.896784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.896795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.896813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.896825 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.904149 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.923202 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.946972 4810 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.959280 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.972002 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.988325 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:24Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.999498 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.999521 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:24 crc kubenswrapper[4810]: I1003 06:57:24.999529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:24.999545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:24.999557 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:24Z","lastTransitionTime":"2025-10-03T06:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.001821 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.016170 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.031205 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.043409 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.068311 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.082229 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.096762 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.101705 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.101752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.101765 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.101784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.101798 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.116792 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.133065 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 
06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.204768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.204818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.204834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.204856 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.204873 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.302516 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.302559 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.302621 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.302559 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:25 crc kubenswrapper[4810]: E1003 06:57:25.302720 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:25 crc kubenswrapper[4810]: E1003 06:57:25.302837 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:25 crc kubenswrapper[4810]: E1003 06:57:25.303001 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:25 crc kubenswrapper[4810]: E1003 06:57:25.303124 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.308047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.308099 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.308118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.308142 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.308160 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.411128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.411202 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.411219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.411245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.411265 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.514548 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.514608 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.514619 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.514636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.514650 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.618343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.618401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.618411 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.618430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.618444 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.721727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.721953 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.721987 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.722017 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.722045 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.825838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.825930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.825950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.825977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.825998 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.848427 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/3.log" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.849612 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/2.log" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.854232 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" exitCode=1 Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.854302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.854356 4810 scope.go:117] "RemoveContainer" containerID="9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.857453 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 06:57:25 crc kubenswrapper[4810]: E1003 06:57:25.858190 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.884064 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.910202 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.929465 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.929506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.929518 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.929536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.929548 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:25Z","lastTransitionTime":"2025-10-03T06:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.947868 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a1
15d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a2a4e75fd4b3157d3d0bfeaa4ab073dbb3dc760f50307542e566d397343e702\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:56:55Z\\\",\\\"message\\\":\\\"h for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"53c717ca-2174-4315-bb03-c937a9c0d9b6\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.188\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1003 06:56:55.306856 6444 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:25Z\\\",\\\"message\\\":\\\"ing new object: *v1.Pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167907 6821 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-vnb5p in node crc\\\\nI1003 06:57:25.167914 6821 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-vnb5p after 0 failed attempt(s)\\\\nI1003 06:57:25.167918 6821 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167886 6821 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 06:57:25.167935 6821 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167953 6821 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167962 6821 ovn.go:134] Ensuring zone local for Pod 
openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nF1003 06:57:25.167983 6821 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.968608 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 
06:57:25 crc kubenswrapper[4810]: I1003 06:57:25.992296 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:25Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.015036 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.032734 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.032810 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.032850 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.032881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.032931 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.035603 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.055186 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.077950 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.099868 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.126156 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f
7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.136721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.136794 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.136821 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.136854 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.136877 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.152136 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.170048 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.206836 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.228377 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.240581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.240684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.240701 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.240726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.240743 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.246182 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.262689 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.281071 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.344051 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.344110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.344127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.344154 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.344174 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.447712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.447784 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.447804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.447833 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.447855 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.551098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.551155 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.551173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.551199 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.551217 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.653838 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.653885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.653923 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.653946 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.653962 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.757752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.757826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.757845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.757881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.757949 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.860285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.860347 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.860360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.860382 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.860395 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.862366 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/3.log" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.867847 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 06:57:26 crc kubenswrapper[4810]: E1003 06:57:26.868167 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.890859 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.911942 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.932721 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.960062 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a1
15d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:25Z\\\",\\\"message\\\":\\\"ing new object: *v1.Pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167907 6821 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-vnb5p in node crc\\\\nI1003 06:57:25.167914 6821 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-vnb5p after 0 failed attempt(s)\\\\nI1003 06:57:25.167918 6821 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167886 6821 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 06:57:25.167935 6821 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167953 6821 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167962 6821 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nF1003 06:57:25.167983 6821 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.964106 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.964210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.964234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.964265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.964290 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:26Z","lastTransitionTime":"2025-10-03T06:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:26 crc kubenswrapper[4810]: I1003 06:57:26.983168 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:26Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.003799 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.024701 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.046092 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.062639 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.069028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.069128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.069144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.069166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.069180 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.080816 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.104415 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 
2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.122133 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.140399 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.156347 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.172612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.172666 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.172681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.172700 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.172712 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.180360 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.198437 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.216876 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.233791 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.274554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.274590 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.274603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.274618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.274631 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.301844 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.301928 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:27 crc kubenswrapper[4810]: E1003 06:57:27.302009 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.302088 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.301845 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:27 crc kubenswrapper[4810]: E1003 06:57:27.302161 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:27 crc kubenswrapper[4810]: E1003 06:57:27.302220 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:27 crc kubenswrapper[4810]: E1003 06:57:27.302275 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.316766 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.351095 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.364988 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.377881 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.378015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.378040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.378334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.378439 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.387057 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.401838 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.420710 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.436400 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.454983 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.472784 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.481316 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.481360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.481378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.481401 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.481414 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.492413 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a1
15d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:25Z\\\",\\\"message\\\":\\\"ing new object: *v1.Pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167907 6821 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-vnb5p in node crc\\\\nI1003 06:57:25.167914 6821 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-vnb5p after 0 failed attempt(s)\\\\nI1003 06:57:25.167918 6821 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167886 6821 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 06:57:25.167935 6821 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167953 6821 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167962 6821 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nF1003 06:57:25.167983 6821 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.508452 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.523563 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.536479 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.547110 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.559755 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.573024 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.584617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.584671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.584689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.584716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.584737 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.588479 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.610055 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:27Z is after 
2025-08-24T17:21:41Z" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.689181 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.689244 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.689268 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.689302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.689323 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.792744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.792786 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.792799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.792818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.792832 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.898362 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.898430 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.898447 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.898471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:27 crc kubenswrapper[4810]: I1003 06:57:27.898492 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:27Z","lastTransitionTime":"2025-10-03T06:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.002415 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.002481 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.002500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.002529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.002550 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.106211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.106288 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.106306 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.106336 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.106359 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.209633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.209694 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.209725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.209752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.209769 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.313123 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.313200 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.313219 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.313248 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.313265 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.416525 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.416589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.416604 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.416627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.416643 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.521294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.521381 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.521394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.521416 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.521429 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.624516 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.624581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.624605 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.624631 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.624648 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.727651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.727694 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.727707 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.727728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.727742 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.831811 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.831876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.831945 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.831976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.832001 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.934956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.935014 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.935026 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.935045 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:28 crc kubenswrapper[4810]: I1003 06:57:28.935058 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:28Z","lastTransitionTime":"2025-10-03T06:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.038200 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.038278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.038298 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.038330 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.038354 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.141859 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.141962 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.141985 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.142020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.142045 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.245581 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.245743 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.245766 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.245793 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.245853 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.301598 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.301694 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.301865 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.302379 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:29 crc kubenswrapper[4810]: E1003 06:57:29.302362 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:29 crc kubenswrapper[4810]: E1003 06:57:29.302614 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:29 crc kubenswrapper[4810]: E1003 06:57:29.302835 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:29 crc kubenswrapper[4810]: E1003 06:57:29.303059 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.349973 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.350072 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.350094 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.350126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.350148 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.453542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.453586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.453597 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.453618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.453630 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.563175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.563265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.563289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.563326 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.563353 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.666476 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.666540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.666583 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.666605 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.666625 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.770243 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.770319 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.770338 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.770368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.770392 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.873787 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.873860 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.873877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.873942 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.873962 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.977601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.977651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.977663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.977680 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:29 crc kubenswrapper[4810]: I1003 06:57:29.977693 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:29Z","lastTransitionTime":"2025-10-03T06:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.081563 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.081642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.081662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.081698 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.081720 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.185676 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.185741 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.185763 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.185792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.185813 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.289360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.289463 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.289480 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.289503 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.289520 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.392811 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.392939 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.392980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.393018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.393040 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.496667 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.496781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.496803 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.496836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.496858 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.599740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.600118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.600334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.600500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.600652 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.703570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.704007 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.704101 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.704198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.704284 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.807279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.807345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.807364 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.807397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.807419 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.910633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.910750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.910768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.910788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:30 crc kubenswrapper[4810]: I1003 06:57:30.910800 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:30Z","lastTransitionTime":"2025-10-03T06:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.014612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.014693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.014711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.014776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.014796 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.117648 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.117752 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.117771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.117808 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.117829 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.221446 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.221527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.221545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.221580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.221604 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.294705 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.294953 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.294880806 +0000 UTC m=+148.722131591 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.295026 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.295115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.295181 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.295249 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295274 4810 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295385 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.295358349 +0000 UTC m=+148.722609124 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295417 4810 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295460 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295503 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295530 4810 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295573 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.295531653 +0000 UTC m=+148.722782418 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295611 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.295588724 +0000 UTC m=+148.722839699 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295652 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295712 4810 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295730 4810 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.295832 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.29580259 +0000 UTC m=+148.723053525 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.301953 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.301982 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.302114 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.302092 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.302336 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.302372 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.302460 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:31 crc kubenswrapper[4810]: E1003 06:57:31.302820 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.321732 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.324686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.324740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.324764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.324796 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.324818 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.428855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.428972 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.429000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.429033 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.429056 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.532941 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.533006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.533025 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.533050 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.533068 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.636769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.636855 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.636882 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.636955 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.636981 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.740609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.740677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.740695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.740724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.740746 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.844241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.844346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.844365 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.844393 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.844420 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.948791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.949299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.949493 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.949670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:31 crc kubenswrapper[4810]: I1003 06:57:31.949808 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:31Z","lastTransitionTime":"2025-10-03T06:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.053867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.053965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.053979 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.054005 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.054020 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.157674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.158460 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.158740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.158936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.159065 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.262540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.262632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.262662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.262695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.262717 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.366708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.366822 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.366844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.366872 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.366927 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.470399 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.470444 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.470454 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.470474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.470486 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.574148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.574216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.574234 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.574263 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.574282 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.678110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.678245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.678269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.678305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.678329 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.781611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.781679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.781696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.781719 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.781739 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.884632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.884728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.884754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.884794 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.884820 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.988312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.988378 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.988390 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.988414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:32 crc kubenswrapper[4810]: I1003 06:57:32.988427 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:32Z","lastTransitionTime":"2025-10-03T06:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.092136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.092196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.092205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.092229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.092242 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.195552 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.195906 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.195916 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.195933 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.195943 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.297980 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.298034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.298048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.298067 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.298079 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.301879 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:33 crc kubenswrapper[4810]: E1003 06:57:33.302016 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.302038 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.302057 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:33 crc kubenswrapper[4810]: E1003 06:57:33.302142 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:33 crc kubenswrapper[4810]: E1003 06:57:33.302190 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.302256 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:33 crc kubenswrapper[4810]: E1003 06:57:33.302383 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.400809 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.400851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.400862 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.400878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.400888 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.504776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.504831 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.504853 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.504885 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.504939 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
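[Editor's note on the entries above] The repeated "Error syncing pod, skipping" records for network-metrics-daemon-drrxm, network-check-source, network-check-target and networking-console-plugin all trace back to the same condition: the kubelet reports NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/, so no pod sandbox can be given a network. A minimal Python sketch of that directory check follows; the path comes from the log message itself, while the accepted file extensions are an assumption based on common CNI conventions, not something stated in the log.

# Sketch only: list candidate CNI config files the way an operator might when
# the kubelet reports "no CNI configuration file in /etc/kubernetes/cni/net.d/".
# The extension list is an assumption based on common CNI conventions.
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"              # path taken from the log message
CANDIDATE_EXTENSIONS = (".conf", ".conflist", ".json")  # assumed CNI config extensions

def list_cni_configs(conf_dir: str = CNI_CONF_DIR) -> list[str]:
    """Return CNI config file names found in conf_dir, or [] if none or missing."""
    try:
        return sorted(
            name for name in os.listdir(conf_dir)
            if name.endswith(CANDIDATE_EXTENSIONS)
        )
    except FileNotFoundError:
        return []

if __name__ == "__main__":
    configs = list_cni_configs()
    if configs:
        print("CNI configs found:", ", ".join(configs))
    else:
        print(f"No CNI configuration files in {CNI_CONF_DIR}; "
              "consistent with the NetworkPluginNotReady condition in the log.")

An empty result would match the NodeNotReady loop recorded above; once the network provider writes a configuration into that directory, the kubelet is expected to flip NetworkReady back to true.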
Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.609062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.609176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.609190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.609222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.609242 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.712629 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.712683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.712699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.712725 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.712741 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.816972 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.817012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.817024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.817046 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.817058 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.920094 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.920144 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.920155 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.920182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:33 crc kubenswrapper[4810]: I1003 06:57:33.920196 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:33Z","lastTransitionTime":"2025-10-03T06:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.024312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.024423 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.024442 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.024478 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.024504 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.127616 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.127681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.127699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.127730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.127753 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.231228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.231286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.231305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.231333 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.231352 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.334266 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.334351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.334373 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.334403 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.334424 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.438650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.438718 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.438740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.438768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.438790 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.498673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.498764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.498785 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.498812 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.498831 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.518007 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.524414 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.524655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.524834 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.525114 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.525288 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.544994 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.551122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.551191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.551216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.551243 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.551265 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.568235 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.574717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.574791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.574808 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.574833 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.574851 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.590884 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.595421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.595482 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.595497 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.595517 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.595530 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.615799 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:34Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:34 crc kubenswrapper[4810]: E1003 06:57:34.616218 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.618241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.618297 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.618311 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.618330 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.618343 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.721575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.722193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.722395 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.722630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.722850 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.827417 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.827482 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.827499 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.827526 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.827546 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.931860 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.931929 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.931939 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.931956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:34 crc kubenswrapper[4810]: I1003 06:57:34.931968 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:34Z","lastTransitionTime":"2025-10-03T06:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.035206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.035268 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.035285 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.035307 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.035326 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.138440 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.138547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.138565 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.138588 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.138606 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.242799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.242866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.242880 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.242936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.242953 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.302206 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.302415 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:35 crc kubenswrapper[4810]: E1003 06:57:35.302664 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.302742 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.302715 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:35 crc kubenswrapper[4810]: E1003 06:57:35.302926 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:35 crc kubenswrapper[4810]: E1003 06:57:35.303009 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:35 crc kubenswrapper[4810]: E1003 06:57:35.303079 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.345929 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.346004 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.346022 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.346047 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.346065 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.449696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.449767 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.449788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.449819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.449841 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.553529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.553994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.554173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.554351 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.554561 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.658489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.658791 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.658965 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.659093 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.659195 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.762064 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.762129 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.762151 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.762193 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.762213 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.865759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.865810 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.865826 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.865846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.865859 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.969236 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.969280 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.969291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.969312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:35 crc kubenswrapper[4810]: I1003 06:57:35.969324 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:35Z","lastTransitionTime":"2025-10-03T06:57:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.072691 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.073065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.073237 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.073411 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.073618 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.177702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.177804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.177817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.177835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.177844 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.288205 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.288267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.288287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.288313 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.288332 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.391862 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.391994 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.392021 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.392057 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.392081 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.496507 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.496577 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.496600 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.496630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.496654 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.600696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.600777 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.600788 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.600806 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.600817 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.703572 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.703633 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.703643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.703665 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.703677 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.806709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.806846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.806865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.806930 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.806975 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.909727 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.909779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.909795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.909818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:36 crc kubenswrapper[4810]: I1003 06:57:36.909836 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:36Z","lastTransitionTime":"2025-10-03T06:57:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.012728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.012790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.012813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.012844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.012866 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.116805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.116871 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.116887 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.116940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.116958 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.220519 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.220620 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.220647 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.220687 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.220716 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.301847 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.301950 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.301931 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.302475 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:37 crc kubenswrapper[4810]: E1003 06:57:37.302466 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:37 crc kubenswrapper[4810]: E1003 06:57:37.302626 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:37 crc kubenswrapper[4810]: E1003 06:57:37.302730 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:37 crc kubenswrapper[4810]: E1003 06:57:37.302890 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.321443 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.324781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.324840 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.324852 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.324875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.324919 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.346302 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.361084 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.377506 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.394190 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.412969 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.427045 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.427109 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.427127 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.427153 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.427172 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.434460 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.450706 4810 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.463114 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.482164 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.497081 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.529083 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7c
fd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.530027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.530095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.530111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.530136 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.530156 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.551060 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.567728 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.592011 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a1
15d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:25Z\\\",\\\"message\\\":\\\"ing new object: *v1.Pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167907 6821 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-vnb5p in node crc\\\\nI1003 06:57:25.167914 6821 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-vnb5p after 0 failed attempt(s)\\\\nI1003 06:57:25.167918 6821 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167886 6821 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 06:57:25.167935 6821 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167953 6821 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167962 6821 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nF1003 06:57:25.167983 6821 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.609282 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.623162 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee0378a5-824f-4a59-a22e-5bb2361ebce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ee80464125edb57275df5080d958b19a4423fab16e5e62e4be550c84a514a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.633197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.633286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.633312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.633344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.633368 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.648160 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.664501 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:37Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.736540 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.736618 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.736642 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.736672 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.736698 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.839214 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.839288 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.839309 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.839338 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.839358 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.944732 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.944785 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.944795 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.944819 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:37 crc kubenswrapper[4810]: I1003 06:57:37.944833 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:37Z","lastTransitionTime":"2025-10-03T06:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.047844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.047936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.047958 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.047987 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.048010 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.153477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.153524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.153536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.153552 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.153562 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.257175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.257245 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.257265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.257289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.257306 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.360160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.360204 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.360213 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.360232 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.360247 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.462955 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.463034 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.463052 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.463082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.463101 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.566182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.566229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.566239 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.566276 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.566290 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.670116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.670198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.670216 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.670246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.670267 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.774210 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.774310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.774331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.774360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.774384 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.877557 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.877631 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.877651 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.877688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.877732 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.982257 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.982334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.982353 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.982418 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:38 crc kubenswrapper[4810]: I1003 06:57:38.982439 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:38Z","lastTransitionTime":"2025-10-03T06:57:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.086619 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.086690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.086711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.086743 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.086762 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.190126 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.190188 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.190203 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.190227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.190244 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.293539 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.293612 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.293631 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.293658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.293678 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.302000 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.302036 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.302174 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:39 crc kubenswrapper[4810]: E1003 06:57:39.302171 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.302242 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:39 crc kubenswrapper[4810]: E1003 06:57:39.302353 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:39 crc kubenswrapper[4810]: E1003 06:57:39.302761 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:39 crc kubenswrapper[4810]: E1003 06:57:39.302963 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.396828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.396917 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.396934 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.396956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.396971 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.500529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.500580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.500592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.500611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.500627 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.603649 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.603726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.603744 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.603770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.603788 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.707160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.707250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.707265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.707292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.707308 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.811952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.812255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.812271 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.812291 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.812303 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.916424 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.916508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.916533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.916603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:39 crc kubenswrapper[4810]: I1003 06:57:39.916633 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:39Z","lastTransitionTime":"2025-10-03T06:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.021778 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.021829 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.021841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.021859 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.021872 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.124436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.124495 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.124508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.124527 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.124541 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.227191 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.227227 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.227235 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.227250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.227260 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.330265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.330320 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.330334 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.330349 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.330374 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.433467 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.433529 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.433545 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.433566 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.433582 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.536387 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.536568 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.536597 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.536632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.536659 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.640638 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.640716 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.640726 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.640748 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.640760 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.745255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.745333 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.745353 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.745384 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.745404 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.849161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.849233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.849253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.849284 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.849303 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.969113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.969176 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.969194 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.969217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:40 crc kubenswrapper[4810]: I1003 06:57:40.969236 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:40Z","lastTransitionTime":"2025-10-03T06:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.072989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.073055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.073078 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.073116 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.073140 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.177206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.177287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.177305 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.177344 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.177366 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.281056 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.281167 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.281190 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.281255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.281273 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.301920 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.302026 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.302057 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.302179 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:41 crc kubenswrapper[4810]: E1003 06:57:41.302723 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:41 crc kubenswrapper[4810]: E1003 06:57:41.302829 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:41 crc kubenswrapper[4810]: E1003 06:57:41.302867 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:41 crc kubenswrapper[4810]: E1003 06:57:41.303016 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.303227 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 06:57:41 crc kubenswrapper[4810]: E1003 06:57:41.303514 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.385589 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.385646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.385661 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.385681 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.385693 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.488646 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.488685 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.488696 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.488711 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.488722 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.592161 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.592225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.592248 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.592276 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.592298 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.695517 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.695570 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.695587 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.695611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.695628 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.799221 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.799286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.799310 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.799338 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.799361 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.901746 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.901801 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.901814 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.901841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:41 crc kubenswrapper[4810]: I1003 06:57:41.901856 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:41Z","lastTransitionTime":"2025-10-03T06:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.004989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.005068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.005087 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.005123 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.005142 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.108751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.108825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.108844 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.108875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.108930 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.212143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.212644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.212764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.212879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.213043 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.318177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.318247 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.318265 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.318294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.318313 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.420919 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.421318 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.421412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.421531 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.421611 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.525664 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.525730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.525743 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.525765 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.525779 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.629361 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.629437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.629447 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.629466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.629476 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.732684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.732739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.732749 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.732771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.732782 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.835233 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.835323 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.835335 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.835386 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.835401 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.939477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.939524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.939533 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.939550 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:42 crc kubenswrapper[4810]: I1003 06:57:42.939561 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:42Z","lastTransitionTime":"2025-10-03T06:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.043478 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.043547 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.043561 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.043587 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.043603 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.147508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.147586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.147604 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.147636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.147656 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.252110 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.252274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.252301 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.252336 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.252357 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.302472 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.302751 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.302842 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.302916 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:43 crc kubenswrapper[4810]: E1003 06:57:43.303121 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:43 crc kubenswrapper[4810]: E1003 06:57:43.303207 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:43 crc kubenswrapper[4810]: E1003 06:57:43.303341 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:43 crc kubenswrapper[4810]: E1003 06:57:43.303467 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.355956 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.356019 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.356035 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.356053 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.356067 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.459689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.459751 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.459771 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.460128 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.460407 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.564058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.564412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.564607 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.564843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.565092 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.668543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.668704 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.668724 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.668750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.668767 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.772554 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.772630 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.772650 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.772679 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.772702 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.875759 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.875825 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.875847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.875873 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.875919 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.979044 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.979098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.979111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.979135 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:43 crc kubenswrapper[4810]: I1003 06:57:43.979151 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:43Z","lastTransitionTime":"2025-10-03T06:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.083850 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.083935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.083952 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.083977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.083997 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.187770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.187849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.187870 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.187932 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.187953 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.291682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.291754 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.291776 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.291805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.291827 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.395770 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.395835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.395851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.395878 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.395919 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.498671 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.498737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.498757 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.498843 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.498860 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.602728 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.602804 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.602821 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.602846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.602865 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.706294 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.706388 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.706400 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.706422 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.706438 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.810290 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.810359 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.810372 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.810396 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.810408 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.909603 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.909677 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.909699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.909760 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.909785 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: E1003 06:57:44.933117 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.939249 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.939308 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.939327 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.939353 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.939371 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: E1003 06:57:44.964564 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.971220 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.971274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.971289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.971311 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.971329 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:44 crc kubenswrapper[4810]: E1003 06:57:44.991574 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:44Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.996621 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.996663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.996675 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.996717 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:44 crc kubenswrapper[4810]: I1003 06:57:44.996731 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:44Z","lastTransitionTime":"2025-10-03T06:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.015982 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.020961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.021008 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.021020 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.021037 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.021050 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.038939 4810 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cf31509f-a3d9-4881-a83a-a21fceb0f392\\\",\\\"systemUUID\\\":\\\"54332209-85c4-4e31-bef8-717ad4ff0760\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:45Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.039277 4810 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.041748 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.041805 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.041818 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.041832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.041844 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.146756 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.146851 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.146875 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.147363 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.147634 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.250925 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.251000 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.251055 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.251081 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.251100 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.302516 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.302807 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.302856 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.302871 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.304620 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.304476 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.304759 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.304941 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.354009 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.354076 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.354096 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.354118 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.354136 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.457541 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.457673 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.457702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.457731 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.457751 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.560779 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.560866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.560883 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.560957 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.560977 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.664018 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.664089 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.664146 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.664177 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.664205 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.667207 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.667457 4810 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:57:45 crc kubenswrapper[4810]: E1003 06:57:45.667566 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs podName:08c7f4c0-52b1-4047-9da9-c6a76b0e06e7 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:49.667532545 +0000 UTC m=+163.094783310 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs") pod "network-metrics-daemon-drrxm" (UID: "08c7f4c0-52b1-4047-9da9-c6a76b0e06e7") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.768543 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.768611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.768623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.768641 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.768654 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.871978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.872048 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.872068 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.872095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.872116 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.974773 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.974832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.974847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.974866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:45 crc kubenswrapper[4810]: I1003 06:57:45.974883 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:45Z","lastTransitionTime":"2025-10-03T06:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.077750 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.077817 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.077839 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.077867 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.077889 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.181523 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.181592 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.181611 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.181636 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.181656 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.285139 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.285215 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.285239 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.285267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.285290 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.387772 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.387842 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.387860 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.387886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.387968 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.491088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.491148 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.491171 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.491206 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.491228 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.594276 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.594343 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.594360 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.594384 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.594404 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.696824 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.696879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.696945 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.696970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.696987 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.799580 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.799632 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.799643 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.799663 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.799675 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.902408 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.902466 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.902483 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.902506 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:46 crc kubenswrapper[4810]: I1003 06:57:46.902524 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:46Z","lastTransitionTime":"2025-10-03T06:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.004781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.004836 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.004849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.004869 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.004880 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.108012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.108111 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.108133 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.108165 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.108187 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.211083 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.211143 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.211160 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.211184 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.211202 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.301837 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.301961 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.301993 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:47 crc kubenswrapper[4810]: E1003 06:57:47.302105 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.302129 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:47 crc kubenswrapper[4810]: E1003 06:57:47.302218 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:47 crc kubenswrapper[4810]: E1003 06:57:47.302316 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:47 crc kubenswrapper[4810]: E1003 06:57:47.302390 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.314076 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.314196 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.314269 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.314304 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.314375 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.321588 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b92e05f0-dd9a-4fd2-8eef-9ca9f9def860\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e08332b0cf6ac85c2589d3bec3fae5de8fae8dea7f8790241ef5bfcbfcc797f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a95445ed0d63749e461fc0e86d27b474e0e374fab9e603019b88c7efcdd7fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4747b628060a1275ffd6a56e6b8fbecf053fc56b96672837b0042116a03c4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eb92ee787ede0f6f498e3bf89522a8bc7833b281faa604afe3606f75131edd02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.339376 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cce49d494ea2a4174add0a3f5532eca27afe61c7fa807f6e25c594efe4bee7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z6wrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-z8f25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.354390 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a71d33c-dc75-4c28-bda0-0b3793de7de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dccb87a733522fc4051eb563645de5cf31771f112be5c8129e5a9401964d4574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f
7893e482dbd4f51455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dabb6c04687bab80e8d445646be37ca4fc771038255b2f7893e482dbd4f51455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12e5700a0bb0990ea353f7ab6f4a9d40af7b42644b012fd6d128bd3b60e6b3e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10976608ee3a17f42e8a3da4ed22a63ec76ac99f5a83f1afc595ac4f02df2cfe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\
"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ed48c72320fe90c52ca79b48e25b7f5ee5bb0d909061e009ed17d1472833488\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c92a73139cf0c55ae7768be160ae7a5889bd80ac2129c375191ab417284ff415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0249a73e048fe4e38731bd39caad2fc3132c9d53735e1e610f5df8509254c89\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9rcz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6fdnr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.373967 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce9ff89b-afa1-492e-bdf2-cf012ddd2f34\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5ef2fcb17d33a46afa178f70754b021c2962c9ec91e99830d622b07b102d31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccac1833655d5c2436c7564aa95846abaa866b1db78a1cdf3db84fff990e7e8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5e3ef938e44204e0e2fd239d98d97ee026f2c255b86ab94da6f78dced593283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0670c7b36893ee38bdbacf577bd40e39cee7cfd0afcdc3a4f4f9a6f2d4096c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e9c4fabba8c14b97af3c9e4c5a06530342ad9551c3c72ed70d09f0303cde1c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e97f57a43b2303be3f279ccdd09f5556c3d300402843f6d72cb8bdff95bb2149\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e0513ffeda77f8bb43b660b0e1ac0a180ec58a099703cb15cb2f5ffc913f9a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41f45fa1c71a850f1a84f52a44e23cf24df87477ab07d91c2d1332d80840dbec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.385426 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.399605 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.412403 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c1cb407e6a1c1da2286d3c36866dffac7b42b07230395e0cc77c56d8ac29629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd86037c2da65d5d9cc11cd6310454968d0440cf355bd7fc22bb24838e0c05c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.418991 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.419024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.419037 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.419054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.419066 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.427840 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.443372 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8pnks" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f9421086-70f1-441e-9aa0-5ac57a048c89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:57:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:15Z\\\",\\\"message\\\":\\\"2025-10-03T06:56:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6\\\\n2025-10-03T06:56:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_16c03fd4-dd4a-49b7-92bd-b6d532807bf6 to /host/opt/cni/bin/\\\\n2025-10-03T06:56:30Z [verbose] multus-daemon started\\\\n2025-10-03T06:56:30Z [verbose] Readiness Indicator file check\\\\n2025-10-03T06:57:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:57:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bzk26\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8pnks\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.453704 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-drrxm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bgs4v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-drrxm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.469827 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee0378a5-824f-4a59-a22e-5bb2361ebce6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6ee80464125edb57275df5080d958b19a4423fab16e5e62e4be550c84a514a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae41caf9a2d9d0b38d37254ba3425b2a734a9fe44f72e6ae664781d3f7b3d094\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.490628 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44be835e-1ae0-474e-a37f-4aa63f8920f5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a560786b579c49d9ce46fb91ef45ec643aae3891690c4c868ea855ca3f3e1fa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cdb480f634d8093290f0027899bf463b3d4da4cf9d2426709ec13b1c8879e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3070738247cb2a272c43f7a07fc9d92425ccf10b152e9a07f28099738d9cf46e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cef89a639820f9c84213350e10d3c9d60db4072438772b16a98219a946110774\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.505182 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://49c5a7168b89d25e54930685c65c87b923f07da4c2661309a20be99aae4226a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.522437 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.522509 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.522532 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.522562 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.522583 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.528796 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a1
15d3aec38c2ceb72207c8afc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T06:57:25Z\\\",\\\"message\\\":\\\"ing new object: *v1.Pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167907 6821 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-vnb5p in node crc\\\\nI1003 06:57:25.167914 6821 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-vnb5p after 0 failed attempt(s)\\\\nI1003 06:57:25.167918 6821 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-vnb5p\\\\nI1003 06:57:25.167886 6821 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1003 06:57:25.167935 6821 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167953 6821 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI1003 06:57:25.167962 6821 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nF1003 06:57:25.167983 6821 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped a\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:57:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b4bdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-whnpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.545816 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44db27b9-37f2-443a-8c72-3109af8b80bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://249c0497df0b8af6b7a68e0b6af6cad0a9f6552e36820e2c1e9734ab7cc04f9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b93301c77ee354b61202d8e145fde12e30681b03784fee544edf0cc915306d0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mz4ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-pz9mk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.558784 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4b235f75-14bc-4ced-adb6-c522a40a7bdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://420e661db0423df4e0c9a5879239e1867cdc6bc279f0ccae9fc82f4a76842d57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c2e4a800fbf060433677fe1ebedef1ea2bfe9ab42e9707ea0e1802a14c24abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5228c086bc36fd866c1499f082f903419fa3673334236a07bff20bcfce43b06d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d383075a74df80499c16a3151a279cf4c30167fd06de5c08ea66432c760b9f79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f72c8b5b51a8aa59a06009da9b89ae59e47f411b7d7f566678e32a151ae8dba1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T06:56:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1003 06:56:20.927554 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 06:56:20.929780 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2912697103/tls.crt::/tmp/serving-cert-2912697103/tls.key\\\\\\\"\\\\nI1003 06:56:26.365646 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 06:56:26.373410 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 06:56:26.373435 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 06:56:26.373456 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 06:56:26.373461 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 06:56:26.377787 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1003 06:56:26.377809 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1003 06:56:26.377811 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 06:56:26.377814 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 06:56:26.377848 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 06:56:26.377853 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 06:56:26.377858 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 06:56:26.377861 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 06:56:26.379864 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a6eb1b4994bffe9eab51a1c97c6327858f5ce305eea9b78f472742f51f4abf6\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94d076b338a0e1f4e1aa276483420f9e0b45b9c51c22627fe172f9da70c0a167\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T06:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T06:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.575454 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76030625187ffcc302718bc81c7c7a34ab47ceb0155adfe40895420a7a401c20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.589688 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d69n4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d67ab19-ac19-4673-ade0-a35ffb299e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9299b7c0a1edfff07f1a3abcaa131156b070d58980b9ca81f1b5b13ba0a64e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxq2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d69n4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.602089 4810 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-vnb5p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59e2d65e-f3c1-4b45-8c19-b033bd5f3aac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T06:56:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd8ce586c6c3ce067f3d73acbee488921afe3f125b0d750e98ee225b5a04690a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T06:56:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sxvrz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T06:56:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-vnb5p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T06:57:47Z is after 2025-08-24T17:21:41Z" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.625609 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.625659 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.625668 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.625688 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.625698 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.729474 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.729537 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.729555 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.729586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.729617 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.833312 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.833368 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.833392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.833420 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.833445 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.936670 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.936769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.936790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.937331 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:47 crc kubenswrapper[4810]: I1003 06:57:47.937420 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:47Z","lastTransitionTime":"2025-10-03T06:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.041224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.041286 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.041299 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.041324 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.041339 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.144598 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.144674 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.144692 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.144719 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.144739 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.247830 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.247884 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.247944 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.247970 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.247986 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.350524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.350575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.350584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.350599 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.350610 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.452658 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.452690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.452699 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.452712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.452721 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.556150 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.556240 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.556268 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.556295 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.556314 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.659866 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.660001 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.660028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.660063 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.660092 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.763586 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.763662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.763680 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.763708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.763735 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.867253 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.867314 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.867333 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.867361 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.867381 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.971092 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.971433 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.971477 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.971504 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:48 crc kubenswrapper[4810]: I1003 06:57:48.971526 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:48Z","lastTransitionTime":"2025-10-03T06:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.073879 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.073977 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.073992 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.074012 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.074029 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.177185 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.177241 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.177251 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.177279 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.177289 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.279929 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.279978 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.279986 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.279998 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.280006 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.302377 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.302423 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.302433 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.302483 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:49 crc kubenswrapper[4810]: E1003 06:57:49.302533 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:49 crc kubenswrapper[4810]: E1003 06:57:49.302726 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:49 crc kubenswrapper[4810]: E1003 06:57:49.302865 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:49 crc kubenswrapper[4810]: E1003 06:57:49.303545 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.382936 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.382981 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.382995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.383009 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.383020 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.486737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.486803 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.486823 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.486846 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.486864 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.590626 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.590695 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.590712 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.590737 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.590755 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.694386 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.694457 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.694475 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.694500 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.694517 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.797967 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.798358 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.798490 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.798653 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.798791 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.901974 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.902028 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.902038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.902058 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:49 crc kubenswrapper[4810]: I1003 06:57:49.902071 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:49Z","lastTransitionTime":"2025-10-03T06:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.004935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.005027 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.005054 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.005086 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.005115 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.109542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.109623 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.109637 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.109662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.109677 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.212820 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.213197 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.213328 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.213438 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.213539 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.317201 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.317258 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.317270 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.317292 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.317305 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.420841 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.420950 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.420967 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.420989 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.421003 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.524282 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.524345 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.524363 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.524392 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.524411 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.627864 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.628325 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.628417 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.628542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.628642 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.731684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.732146 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.732217 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.732287 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.732346 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.836024 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.836078 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.836095 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.836119 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.836137 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.939489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.940302 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.940614 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.940786 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:50 crc kubenswrapper[4810]: I1003 06:57:50.940964 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:50Z","lastTransitionTime":"2025-10-03T06:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.044284 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.044644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.044847 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.045092 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.045275 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.149508 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.149877 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.150104 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.150338 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.150552 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.254721 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.254792 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.254813 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.254845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.254869 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.302261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:51 crc kubenswrapper[4810]: E1003 06:57:51.302455 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.302635 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.302811 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:51 crc kubenswrapper[4810]: E1003 06:57:51.302958 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:51 crc kubenswrapper[4810]: E1003 06:57:51.303150 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.303424 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:51 crc kubenswrapper[4810]: E1003 06:57:51.303835 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.358768 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.358831 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.358849 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.358876 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.358961 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.462471 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.462596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.462624 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.462653 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.462676 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.566087 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.566166 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.566224 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.566254 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.566276 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.669610 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.669683 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.669708 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.669739 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.669763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.773436 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.773537 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.773559 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.773590 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.773610 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.876045 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.876088 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.876097 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.876113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.876123 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.979075 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.979228 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.979250 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.979315 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:51 crc kubenswrapper[4810]: I1003 06:57:51.979339 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:51Z","lastTransitionTime":"2025-10-03T06:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.082575 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.082617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.082627 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.082644 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.082653 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.185828 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.185883 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.185935 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.185961 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.185978 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.289038 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.289098 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.289122 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.289145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.289166 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.392211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.392267 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.392278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.392295 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.392309 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.494703 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.494740 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.494753 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.494769 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.494780 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.597635 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.597690 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.597709 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.597730 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.597745 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.701040 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.701153 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.701179 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.701255 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.701277 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.804346 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.804435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.804459 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.804489 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.804510 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.907132 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.907173 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.907183 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.907198 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:52 crc kubenswrapper[4810]: I1003 06:57:52.907211 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:52Z","lastTransitionTime":"2025-10-03T06:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.011145 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.011229 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.011246 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.011278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.011303 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.115100 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.115175 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.115195 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.115222 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.115242 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.218211 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.218272 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.218289 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.218313 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.218331 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.303212 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.303218 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:53 crc kubenswrapper[4810]: E1003 06:57:53.303428 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.303446 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.303218 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:53 crc kubenswrapper[4810]: E1003 06:57:53.303611 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:53 crc kubenswrapper[4810]: E1003 06:57:53.303537 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:53 crc kubenswrapper[4810]: E1003 06:57:53.303813 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.320861 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.321225 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.321512 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.321682 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.321947 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.427789 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.428274 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.428486 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.428639 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.428783 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.532790 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.532886 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.532976 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.533006 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.533026 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.636686 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.636756 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.636773 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.636799 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.636817 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.739421 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.739522 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.739542 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.739569 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.739589 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.842060 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.842469 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.842601 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.842689 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.842760 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.946528 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.946584 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.946596 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.946617 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:53 crc kubenswrapper[4810]: I1003 06:57:53.946630 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:53Z","lastTransitionTime":"2025-10-03T06:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.048859 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.048931 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.048943 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.048966 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.048980 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.152450 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.152524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.152544 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.152574 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.152596 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.256988 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.257435 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.257536 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.257662 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.257763 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.303354 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 06:57:54 crc kubenswrapper[4810]: E1003 06:57:54.303686 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-whnpx_openshift-ovn-kubernetes(88c6d2ac-d97b-43a1-8bf7-3cc367fe108e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.361488 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.361995 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.362113 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.362218 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.362313 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.465398 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.465940 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.466062 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.466164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.466263 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.570781 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.570835 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.570845 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.570865 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.570876 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.674764 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.675278 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.675397 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.675524 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.675626 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.778496 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.778675 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.778702 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.778735 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.778754 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.882082 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.882182 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.882200 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.882243 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.882263 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.986065 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.986138 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.986164 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.986192 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:54 crc kubenswrapper[4810]: I1003 06:57:54.986220 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:54Z","lastTransitionTime":"2025-10-03T06:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.089395 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.089478 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.089502 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.089538 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.089561 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:55Z","lastTransitionTime":"2025-10-03T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.192655 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.192684 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.192693 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.192705 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.192714 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:55Z","lastTransitionTime":"2025-10-03T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.296332 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.296379 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.296394 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.296412 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.296431 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:55Z","lastTransitionTime":"2025-10-03T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.301980 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.302126 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.302258 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.302176 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:55 crc kubenswrapper[4810]: E1003 06:57:55.302143 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:55 crc kubenswrapper[4810]: E1003 06:57:55.302533 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:55 crc kubenswrapper[4810]: E1003 06:57:55.302576 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:55 crc kubenswrapper[4810]: E1003 06:57:55.302660 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.365485 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.365832 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.366042 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.366202 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.366337 4810 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T06:57:55Z","lastTransitionTime":"2025-10-03T06:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.442639 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8"] Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.443959 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.448076 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.448085 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.448869 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.449593 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.484589 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=61.484518881 podStartE2EDuration="1m1.484518881s" podCreationTimestamp="2025-10-03 06:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.466598235 +0000 UTC m=+108.893848980" watchObservedRunningTime="2025-10-03 06:57:55.484518881 +0000 UTC m=+108.911769616" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.508275 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podStartSLOduration=88.5082555 podStartE2EDuration="1m28.5082555s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.484516951 +0000 UTC m=+108.911767706" watchObservedRunningTime="2025-10-03 06:57:55.5082555 +0000 UTC m=+108.935506235" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.527639 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6fdnr" podStartSLOduration=88.527615354 podStartE2EDuration="1m28.527615354s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.511916784 +0000 UTC m=+108.939167559" watchObservedRunningTime="2025-10-03 06:57:55.527615354 +0000 UTC m=+108.954866089" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.560461 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=85.560427038 podStartE2EDuration="1m25.560427038s" podCreationTimestamp="2025-10-03 06:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.559577715 +0000 UTC m=+108.986828450" watchObservedRunningTime="2025-10-03 06:57:55.560427038 +0000 UTC m=+108.987677803" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.580531 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.580806 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944b28dd-5656-4765-aa44-a527384e9188-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.580963 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/944b28dd-5656-4765-aa44-a527384e9188-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.581022 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.581265 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/944b28dd-5656-4765-aa44-a527384e9188-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.661968 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8pnks" podStartSLOduration=88.661933751 podStartE2EDuration="1m28.661933751s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.648090911 +0000 UTC m=+109.075341656" watchObservedRunningTime="2025-10-03 06:57:55.661933751 +0000 UTC m=+109.089184486" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/944b28dd-5656-4765-aa44-a527384e9188-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682561 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/944b28dd-5656-4765-aa44-a527384e9188-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682646 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/944b28dd-5656-4765-aa44-a527384e9188-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.682775 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944b28dd-5656-4765-aa44-a527384e9188-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.683413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/944b28dd-5656-4765-aa44-a527384e9188-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.692467 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/944b28dd-5656-4765-aa44-a527384e9188-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.701207 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.701169564 podStartE2EDuration="24.701169564s" podCreationTimestamp="2025-10-03 06:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.67727028 +0000 UTC m=+109.104521035" watchObservedRunningTime="2025-10-03 06:57:55.701169564 +0000 UTC m=+109.128420339" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.701650 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/944b28dd-5656-4765-aa44-a527384e9188-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s75r8\" (UID: \"944b28dd-5656-4765-aa44-a527384e9188\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.719322 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=89.719282874 podStartE2EDuration="1m29.719282874s" podCreationTimestamp="2025-10-03 06:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.701972054 +0000 UTC m=+109.129222789" watchObservedRunningTime="2025-10-03 06:57:55.719282874 +0000 UTC m=+109.146533619" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.764635 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.835432 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-pz9mk" podStartSLOduration=87.835409728 podStartE2EDuration="1m27.835409728s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.810291984 +0000 UTC m=+109.237542719" watchObservedRunningTime="2025-10-03 06:57:55.835409728 +0000 UTC m=+109.262660483" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.855145 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.855122992 podStartE2EDuration="1m29.855122992s" podCreationTimestamp="2025-10-03 06:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.835259775 +0000 UTC m=+109.262510510" watchObservedRunningTime="2025-10-03 06:57:55.855122992 +0000 UTC m=+109.282373727" Oct 03 06:57:55 crc kubenswrapper[4810]: I1003 06:57:55.879106 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d69n4" podStartSLOduration=89.879077846 podStartE2EDuration="1m29.879077846s" podCreationTimestamp="2025-10-03 06:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.868248724 +0000 UTC m=+109.295499459" watchObservedRunningTime="2025-10-03 06:57:55.879077846 +0000 UTC m=+109.306328581" Oct 03 06:57:56 crc kubenswrapper[4810]: I1003 06:57:56.033832 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" event={"ID":"944b28dd-5656-4765-aa44-a527384e9188","Type":"ContainerStarted","Data":"47e39444be3201d3d70eaff74a336c83ce80abc6f10c8cda1e49709bf404ca67"} Oct 03 06:57:56 crc kubenswrapper[4810]: I1003 06:57:56.034217 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" event={"ID":"944b28dd-5656-4765-aa44-a527384e9188","Type":"ContainerStarted","Data":"6f5fd8b18a03454ac11ffbbe8309b2ed8e295889589b2394efd9682933eec51e"} Oct 03 06:57:56 crc kubenswrapper[4810]: I1003 
06:57:56.053190 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vnb5p" podStartSLOduration=89.053158919 podStartE2EDuration="1m29.053158919s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:55.882213678 +0000 UTC m=+109.309464403" watchObservedRunningTime="2025-10-03 06:57:56.053158919 +0000 UTC m=+109.480409654" Oct 03 06:57:57 crc kubenswrapper[4810]: I1003 06:57:57.302120 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:57 crc kubenswrapper[4810]: I1003 06:57:57.302167 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:57 crc kubenswrapper[4810]: I1003 06:57:57.302277 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:57 crc kubenswrapper[4810]: E1003 06:57:57.304963 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:57 crc kubenswrapper[4810]: E1003 06:57:57.305171 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:57 crc kubenswrapper[4810]: I1003 06:57:57.305226 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:57 crc kubenswrapper[4810]: E1003 06:57:57.305435 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:57 crc kubenswrapper[4810]: E1003 06:57:57.305673 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:57:59 crc kubenswrapper[4810]: I1003 06:57:59.302528 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:57:59 crc kubenswrapper[4810]: I1003 06:57:59.302693 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:57:59 crc kubenswrapper[4810]: I1003 06:57:59.302734 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:57:59 crc kubenswrapper[4810]: I1003 06:57:59.302763 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:57:59 crc kubenswrapper[4810]: E1003 06:57:59.302871 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:57:59 crc kubenswrapper[4810]: E1003 06:57:59.303007 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:57:59 crc kubenswrapper[4810]: E1003 06:57:59.303126 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:57:59 crc kubenswrapper[4810]: E1003 06:57:59.303181 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:01 crc kubenswrapper[4810]: I1003 06:58:01.302297 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:01 crc kubenswrapper[4810]: I1003 06:58:01.302350 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:01 crc kubenswrapper[4810]: I1003 06:58:01.302386 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:01 crc kubenswrapper[4810]: I1003 06:58:01.302272 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:01 crc kubenswrapper[4810]: E1003 06:58:01.302527 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:01 crc kubenswrapper[4810]: E1003 06:58:01.302623 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:01 crc kubenswrapper[4810]: E1003 06:58:01.302730 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:01 crc kubenswrapper[4810]: E1003 06:58:01.302802 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.055379 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/1.log" Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.055910 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/0.log" Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.055961 4810 generic.go:334] "Generic (PLEG): container finished" podID="f9421086-70f1-441e-9aa0-5ac57a048c89" containerID="899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065" exitCode=1 Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.055990 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerDied","Data":"899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065"} Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.056031 4810 scope.go:117] "RemoveContainer" containerID="22ce3b8603e29acbc921fd5eebb53c998ae4e405a6268a18843b5d96b5f8ea06" Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.056573 4810 scope.go:117] "RemoveContainer" containerID="899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065" Oct 03 06:58:02 crc kubenswrapper[4810]: E1003 06:58:02.056847 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8pnks_openshift-multus(f9421086-70f1-441e-9aa0-5ac57a048c89)\"" pod="openshift-multus/multus-8pnks" podUID="f9421086-70f1-441e-9aa0-5ac57a048c89" Oct 03 06:58:02 crc kubenswrapper[4810]: I1003 06:58:02.085546 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s75r8" podStartSLOduration=95.085521168 podStartE2EDuration="1m35.085521168s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:57:56.054177136 +0000 UTC m=+109.481427871" watchObservedRunningTime="2025-10-03 06:58:02.085521168 +0000 UTC m=+115.512771983" Oct 03 06:58:03 crc kubenswrapper[4810]: I1003 06:58:03.062651 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/1.log" Oct 03 06:58:03 crc kubenswrapper[4810]: I1003 06:58:03.302172 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:03 crc kubenswrapper[4810]: I1003 06:58:03.302233 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:03 crc kubenswrapper[4810]: I1003 06:58:03.302173 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:03 crc kubenswrapper[4810]: E1003 06:58:03.302342 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:03 crc kubenswrapper[4810]: E1003 06:58:03.302447 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:03 crc kubenswrapper[4810]: E1003 06:58:03.302542 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:03 crc kubenswrapper[4810]: I1003 06:58:03.302618 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:03 crc kubenswrapper[4810]: E1003 06:58:03.302707 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:05 crc kubenswrapper[4810]: I1003 06:58:05.301919 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:05 crc kubenswrapper[4810]: I1003 06:58:05.302034 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:05 crc kubenswrapper[4810]: E1003 06:58:05.302123 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:05 crc kubenswrapper[4810]: I1003 06:58:05.302142 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:05 crc kubenswrapper[4810]: E1003 06:58:05.302316 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:05 crc kubenswrapper[4810]: I1003 06:58:05.302366 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:05 crc kubenswrapper[4810]: E1003 06:58:05.302521 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:05 crc kubenswrapper[4810]: E1003 06:58:05.302670 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.294823 4810 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 06:58:07 crc kubenswrapper[4810]: I1003 06:58:07.301443 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.302799 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:07 crc kubenswrapper[4810]: I1003 06:58:07.302868 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:07 crc kubenswrapper[4810]: I1003 06:58:07.302940 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:07 crc kubenswrapper[4810]: I1003 06:58:07.303051 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.303409 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.303505 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.303670 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:07 crc kubenswrapper[4810]: E1003 06:58:07.420529 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 06:58:09 crc kubenswrapper[4810]: I1003 06:58:09.302069 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:09 crc kubenswrapper[4810]: E1003 06:58:09.302239 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:09 crc kubenswrapper[4810]: I1003 06:58:09.303070 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:09 crc kubenswrapper[4810]: E1003 06:58:09.303141 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:09 crc kubenswrapper[4810]: I1003 06:58:09.303182 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 06:58:09 crc kubenswrapper[4810]: I1003 06:58:09.303193 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:09 crc kubenswrapper[4810]: I1003 06:58:09.303193 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:09 crc kubenswrapper[4810]: E1003 06:58:09.303245 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:09 crc kubenswrapper[4810]: E1003 06:58:09.303318 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.089118 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/3.log" Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.092207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerStarted","Data":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.092646 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.163798 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podStartSLOduration=103.163770786 podStartE2EDuration="1m43.163770786s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:10.126804913 +0000 UTC m=+123.554055668" watchObservedRunningTime="2025-10-03 06:58:10.163770786 +0000 UTC m=+123.591021561" Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.165408 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-drrxm"] Oct 03 06:58:10 crc kubenswrapper[4810]: I1003 06:58:10.165529 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:10 crc kubenswrapper[4810]: E1003 06:58:10.165675 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:11 crc kubenswrapper[4810]: I1003 06:58:11.302374 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:11 crc kubenswrapper[4810]: I1003 06:58:11.302467 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:11 crc kubenswrapper[4810]: I1003 06:58:11.302484 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:11 crc kubenswrapper[4810]: I1003 06:58:11.302604 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:11 crc kubenswrapper[4810]: E1003 06:58:11.303435 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:11 crc kubenswrapper[4810]: E1003 06:58:11.303775 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:11 crc kubenswrapper[4810]: E1003 06:58:11.303977 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:11 crc kubenswrapper[4810]: E1003 06:58:11.304255 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:12 crc kubenswrapper[4810]: E1003 06:58:12.422794 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 06:58:13 crc kubenswrapper[4810]: I1003 06:58:13.302439 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:13 crc kubenswrapper[4810]: I1003 06:58:13.302826 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:13 crc kubenswrapper[4810]: I1003 06:58:13.302874 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:13 crc kubenswrapper[4810]: E1003 06:58:13.302835 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:13 crc kubenswrapper[4810]: I1003 06:58:13.302836 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:13 crc kubenswrapper[4810]: E1003 06:58:13.303103 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:13 crc kubenswrapper[4810]: E1003 06:58:13.303304 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:13 crc kubenswrapper[4810]: E1003 06:58:13.303448 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:15 crc kubenswrapper[4810]: I1003 06:58:15.301992 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:15 crc kubenswrapper[4810]: I1003 06:58:15.302047 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:15 crc kubenswrapper[4810]: E1003 06:58:15.302183 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:15 crc kubenswrapper[4810]: I1003 06:58:15.302266 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:15 crc kubenswrapper[4810]: I1003 06:58:15.302475 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:15 crc kubenswrapper[4810]: E1003 06:58:15.302462 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:15 crc kubenswrapper[4810]: E1003 06:58:15.302539 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:15 crc kubenswrapper[4810]: E1003 06:58:15.302591 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:16 crc kubenswrapper[4810]: I1003 06:58:16.302059 4810 scope.go:117] "RemoveContainer" containerID="899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065" Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.130224 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/1.log" Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.130748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerStarted","Data":"606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c"} Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.301535 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.301621 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.301714 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:17 crc kubenswrapper[4810]: E1003 06:58:17.301859 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:17 crc kubenswrapper[4810]: I1003 06:58:17.303490 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:17 crc kubenswrapper[4810]: E1003 06:58:17.303576 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:17 crc kubenswrapper[4810]: E1003 06:58:17.303746 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:17 crc kubenswrapper[4810]: E1003 06:58:17.303925 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:17 crc kubenswrapper[4810]: E1003 06:58:17.423571 4810 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 06:58:19 crc kubenswrapper[4810]: I1003 06:58:19.302327 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:19 crc kubenswrapper[4810]: I1003 06:58:19.302396 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:19 crc kubenswrapper[4810]: I1003 06:58:19.302398 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:19 crc kubenswrapper[4810]: I1003 06:58:19.302340 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:19 crc kubenswrapper[4810]: E1003 06:58:19.302576 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:19 crc kubenswrapper[4810]: E1003 06:58:19.302673 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:19 crc kubenswrapper[4810]: E1003 06:58:19.302798 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:19 crc kubenswrapper[4810]: E1003 06:58:19.302947 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:21 crc kubenswrapper[4810]: I1003 06:58:21.301878 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:21 crc kubenswrapper[4810]: I1003 06:58:21.301959 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:21 crc kubenswrapper[4810]: I1003 06:58:21.301941 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:21 crc kubenswrapper[4810]: I1003 06:58:21.302145 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:21 crc kubenswrapper[4810]: E1003 06:58:21.302411 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 06:58:21 crc kubenswrapper[4810]: E1003 06:58:21.302523 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 06:58:21 crc kubenswrapper[4810]: E1003 06:58:21.302652 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-drrxm" podUID="08c7f4c0-52b1-4047-9da9-c6a76b0e06e7" Oct 03 06:58:21 crc kubenswrapper[4810]: E1003 06:58:21.302809 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.302548 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.302589 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.302649 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.302849 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.307150 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.307273 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.307306 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.307701 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.310677 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 06:58:23 crc kubenswrapper[4810]: I1003 06:58:23.311329 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.907015 4810 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.959575 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.961472 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.961846 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.962454 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.968504 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.968936 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.970218 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.970736 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.971035 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.972601 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.973162 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.975993 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.978637 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.979539 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.979758 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.982952 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.983387 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.983510 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.983415 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.983626 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.984172 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.984552 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.984713 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tk6p6"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.984959 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.985600 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zthm4"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.986047 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.986416 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.986521 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qhsbc"] Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.986875 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.987017 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.993231 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.996090 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 06:58:25 crc kubenswrapper[4810]: I1003 06:58:25.997880 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.001783 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.002508 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.002761 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.004989 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.005170 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.005741 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.006587 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.006856 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.007276 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.007437 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.007495 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.008183 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.025886 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.026374 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.026673 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.026888 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.027349 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.027386 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cn2tr"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.031790 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.049149 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.049669 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.050100 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.051078 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.051308 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.051610 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.051763 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.051976 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.052184 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.052426 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.052612 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.052728 4810 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.052873 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.053012 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.053112 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j9hjt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.053390 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.053528 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.054287 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ng6vq"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.054498 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.054858 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055046 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-kube-api-access-trq2f\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055230 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055275 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bc7f\" (UniqueName: 
\"kubernetes.io/projected/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-kube-api-access-7bc7f\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055315 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-serving-cert\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055344 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055423 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055461 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhvcj\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-kube-api-access-xhvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055486 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.055803 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.056031 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.054496 4810 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.056425 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057063 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057273 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057498 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057570 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057640 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.057060 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.058572 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.058914 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.059228 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.059364 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.063029 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.063337 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.063756 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.063939 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064128 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064282 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064550 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064785 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064912 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.064890 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065014 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065134 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065227 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065273 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065341 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065384 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065481 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065604 4810 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065234 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065681 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.065847 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.066024 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.066046 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.073292 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.076947 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077066 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077408 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077554 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077694 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077758 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077875 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077945 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077573 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077602 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.078090 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.077638 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.078194 4810 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.079463 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.082133 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.082862 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.083360 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6zfm"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.084613 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.085173 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.085364 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.087308 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.087841 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.088032 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.088166 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.088361 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.088465 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.088571 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.089106 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.089162 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.089253 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.089557 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.089581 4810 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.090460 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.102599 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.103851 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.105314 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.108240 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kkmch"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.110112 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.110331 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.110814 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.111353 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vcdh"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.114499 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-898hd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.114754 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.131718 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.132607 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.134067 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.136628 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.137108 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.137941 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.139422 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.140056 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.141771 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.142002 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.142697 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.143559 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.143790 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.144439 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.144746 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.145261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.148203 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xt6fn"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.149231 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.150527 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.151256 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.151413 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.152446 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.152852 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.153420 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.154095 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.154577 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.155429 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156007 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17e82ce0-83fd-416a-aab4-4a20524f89d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156050 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/926a87b0-d967-45aa-8aff-13dcbf2c98e1-serving-cert\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcgbk\" (UniqueName: \"kubernetes.io/projected/5a21c48a-596a-409d-8021-1425828a8a76-kube-api-access-zcgbk\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156168 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156194 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-serving-cert\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156217 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156239 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvbq\" (UniqueName: \"kubernetes.io/projected/17e82ce0-83fd-416a-aab4-4a20524f89d4-kube-api-access-hzvbq\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156262 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx62q\" (UniqueName: \"kubernetes.io/projected/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-kube-api-access-zx62q\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156285 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156310 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7gcw\" (UniqueName: \"kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156335 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-audit-dir\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgkx\" (UniqueName: 
\"kubernetes.io/projected/926a87b0-d967-45aa-8aff-13dcbf2c98e1-kube-api-access-tpgkx\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156418 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-config\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156446 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczhw\" (UniqueName: \"kubernetes.io/projected/9a499f8e-4807-4432-810e-b240eae2b261-kube-api-access-hczhw\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156472 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-client\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156489 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156525 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-encryption-config\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498044ba-75d4-40c6-a7df-3846dc5b82d0-config\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156818 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156857 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a499f8e-4807-4432-810e-b240eae2b261-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156884 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-config\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156822 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156931 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbcp\" (UniqueName: \"kubernetes.io/projected/10cb2fd4-636c-43cc-8c36-50cffb650f27-kube-api-access-pjbcp\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156955 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.156978 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-config\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157001 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs7g6\" (UniqueName: \"kubernetes.io/projected/1c1c2c63-5588-4027-90a1-b78cdbe3b10b-kube-api-access-zs7g6\") pod \"downloads-7954f5f757-ng6vq\" (UID: \"1c1c2c63-5588-4027-90a1-b78cdbe3b10b\") " pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157022 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/498044ba-75d4-40c6-a7df-3846dc5b82d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" 
Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157044 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-auth-proxy-config\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157094 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba503e7e-7710-4be5-a871-ff39ff8b1296-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn7zl\" (UniqueName: \"kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157131 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157140 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157167 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-config\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-policies\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157258 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157284 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157308 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157330 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157350 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht76p\" (UniqueName: \"kubernetes.io/projected/daa05454-65ea-4796-a2eb-79d127178570-kube-api-access-ht76p\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157388 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157428 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-serving-cert\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157460 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-client\") pod 
\"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157484 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-serving-cert\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba503e7e-7710-4be5-a871-ff39ff8b1296-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdt6\" (UniqueName: \"kubernetes.io/projected/db163501-b4f0-48b4-a558-e7f3d9d1c835-kube-api-access-4wdt6\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157623 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498044ba-75d4-40c6-a7df-3846dc5b82d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157648 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157673 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config\") pod 
\"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157714 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157772 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhvcj\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-kube-api-access-xhvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157909 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157956 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-images\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.157983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-config\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158001 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158022 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158040 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-kube-api-access-trq2f\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: 
\"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158060 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-etcd-serving-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-image-import-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158099 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-trusted-ca\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158160 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-audit\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba503e7e-7710-4be5-a871-ff39ff8b1296-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158251 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f9ncj"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158283 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.158877 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-ca\") pod 
\"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159004 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-node-pullsecrets\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159047 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159065 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159081 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159102 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159121 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krrmz\" (UniqueName: \"kubernetes.io/projected/d131a4aa-4055-4aa9-bfef-4354654e6577-kube-api-access-krrmz\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " 
pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159184 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-encryption-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159203 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a499f8e-4807-4432-810e-b240eae2b261-config\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159217 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26c7z\" (UniqueName: \"kubernetes.io/projected/ec569cec-4aef-4da7-8c8b-9a14f5565471-kube-api-access-26c7z\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159232 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159251 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbn2d\" (UniqueName: \"kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159271 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bc7f\" (UniqueName: \"kubernetes.io/projected/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-kube-api-access-7bc7f\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159287 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a21c48a-596a-409d-8021-1425828a8a76-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17e82ce0-83fd-416a-aab4-4a20524f89d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159322 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d131a4aa-4055-4aa9-bfef-4354654e6577-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159341 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db163501-b4f0-48b4-a558-e7f3d9d1c835-machine-approver-tls\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-service-ca\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159372 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159388 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqct\" (UniqueName: \"kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-serving-cert\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159443 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec569cec-4aef-4da7-8c8b-9a14f5565471-serving-cert\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159486 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: 
\"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159511 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159526 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159542 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-dir\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159559 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-etcd-client\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159582 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.159703 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.160479 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.160591 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.160762 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.160866 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.163576 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.166257 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.168668 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.169911 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.170838 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-serving-cert\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.171718 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ng6vq"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.174294 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.174689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.176766 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.186160 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zthm4"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.186391 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.186454 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.190482 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cn2tr"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.190549 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qhsbc"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.191463 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tk6p6"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.197396 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.198420 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.224930 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.225051 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.225264 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.226952 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.233054 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.235762 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.235823 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j9hjt"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.235834 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.242140 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.244830 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.245706 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.246782 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kkmch"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.247916 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6zfm"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.250666 4810 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.251686 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.252721 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-898hd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.255011 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.256198 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vcdh"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.257170 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b8wfx"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.257865 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.258245 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q28kf"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.259128 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.259441 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260153 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260193 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krrmz\" (UniqueName: \"kubernetes.io/projected/d131a4aa-4055-4aa9-bfef-4354654e6577-kube-api-access-krrmz\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260239 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a499f8e-4807-4432-810e-b240eae2b261-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260255 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26c7z\" (UniqueName: \"kubernetes.io/projected/ec569cec-4aef-4da7-8c8b-9a14f5565471-kube-api-access-26c7z\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260271 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d131a4aa-4055-4aa9-bfef-4354654e6577-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260286 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dx27\" (UniqueName: \"kubernetes.io/projected/dade3b21-b4f5-4558-bc6f-f63ef32cde34-kube-api-access-7dx27\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260322 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a486ed64-5363-4a9a-95e1-d83d58920673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260341 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcgbk\" (UniqueName: \"kubernetes.io/projected/5a21c48a-596a-409d-8021-1425828a8a76-kube-api-access-zcgbk\") 
pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260394 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx62q\" (UniqueName: \"kubernetes.io/projected/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-kube-api-access-zx62q\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260411 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260430 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260451 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-config\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260466 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczhw\" (UniqueName: \"kubernetes.io/projected/9a499f8e-4807-4432-810e-b240eae2b261-kube-api-access-hczhw\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260483 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-client\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260499 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498044ba-75d4-40c6-a7df-3846dc5b82d0-config\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260516 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc 
kubenswrapper[4810]: I1003 06:58:26.260532 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-config\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5861cc2a-875b-459a-8940-9b16b85668a7-signing-key\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a486ed64-5363-4a9a-95e1-d83d58920673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba503e7e-7710-4be5-a871-ff39ff8b1296-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260615 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn7zl\" (UniqueName: \"kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260672 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" 
Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260695 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260712 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dade3b21-b4f5-4558-bc6f-f63ef32cde34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260728 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260760 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260779 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/860c757e-a748-4689-aa5a-3414957b1d43-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260795 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgm5\" (UniqueName: \"kubernetes.io/projected/d564c64d-4755-40aa-967d-8fb49704ef10-kube-api-access-6kgm5\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260827 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht76p\" (UniqueName: \"kubernetes.io/projected/daa05454-65ea-4796-a2eb-79d127178570-kube-api-access-ht76p\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260844 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260861 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c30646-f384-458a-93b3-522895639fad-serving-cert\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260880 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-serving-cert\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260919 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260939 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhtq\" (UniqueName: \"kubernetes.io/projected/5861cc2a-875b-459a-8940-9b16b85668a7-kube-api-access-nxhtq\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260955 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260977 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.260994 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-config\") pod 
\"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261020 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261037 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-trusted-ca\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261055 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-audit\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261072 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba503e7e-7710-4be5-a871-ff39ff8b1296-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261090 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261110 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261144 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc 
kubenswrapper[4810]: I1003 06:58:26.261164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9t58\" (UniqueName: \"kubernetes.io/projected/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-kube-api-access-k9t58\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261210 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-encryption-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261226 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938747a0-8051-4751-99a2-7b3167b23975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261244 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmhm\" (UniqueName: \"kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261263 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbn2d\" (UniqueName: \"kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261280 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv5lk\" (UniqueName: \"kubernetes.io/projected/938747a0-8051-4751-99a2-7b3167b23975-kube-api-access-gv5lk\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261295 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxp69\" (UniqueName: \"kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261320 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a21c48a-596a-409d-8021-1425828a8a76-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261338 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17e82ce0-83fd-416a-aab4-4a20524f89d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db163501-b4f0-48b4-a558-e7f3d9d1c835-machine-approver-tls\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261371 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-service-ca\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqct\" (UniqueName: \"kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261403 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261419 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w257j\" (UniqueName: \"kubernetes.io/projected/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-kube-api-access-w257j\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261442 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec569cec-4aef-4da7-8c8b-9a14f5565471-serving-cert\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 
06:58:26.261459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6grl\" (UniqueName: \"kubernetes.io/projected/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-kube-api-access-d6grl\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261476 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261491 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a486ed64-5363-4a9a-95e1-d83d58920673-config\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261524 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-default-certificate\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-dir\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-etcd-client\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261573 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17e82ce0-83fd-416a-aab4-4a20524f89d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261589 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/926a87b0-d967-45aa-8aff-13dcbf2c98e1-serving-cert\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261606 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261639 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-serving-cert\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261657 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk2px\" (UniqueName: \"kubernetes.io/projected/46c30646-f384-458a-93b3-522895639fad-kube-api-access-zk2px\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvbq\" (UniqueName: \"kubernetes.io/projected/17e82ce0-83fd-416a-aab4-4a20524f89d4-kube-api-access-hzvbq\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261693 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgkx\" (UniqueName: \"kubernetes.io/projected/926a87b0-d967-45aa-8aff-13dcbf2c98e1-kube-api-access-tpgkx\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261727 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261745 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261761 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7gcw\" (UniqueName: \"kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261776 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-audit-dir\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261816 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c30646-f384-458a-93b3-522895639fad-config\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261834 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-encryption-config\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261851 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5vg\" (UniqueName: \"kubernetes.io/projected/860c757e-a748-4689-aa5a-3414957b1d43-kube-api-access-5x5vg\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261865 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth\") pod 
\"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261881 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a499f8e-4807-4432-810e-b240eae2b261-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbcp\" (UniqueName: \"kubernetes.io/projected/10cb2fd4-636c-43cc-8c36-50cffb650f27-kube-api-access-pjbcp\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261933 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261950 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-config\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs7g6\" (UniqueName: \"kubernetes.io/projected/1c1c2c63-5588-4027-90a1-b78cdbe3b10b-kube-api-access-zs7g6\") pod \"downloads-7954f5f757-ng6vq\" (UID: \"1c1c2c63-5588-4027-90a1-b78cdbe3b10b\") " pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.261985 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/498044ba-75d4-40c6-a7df-3846dc5b82d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262000 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-auth-proxy-config\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262019 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 
03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262034 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-config\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-policies\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-client\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262128 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262148 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba503e7e-7710-4be5-a871-ff39ff8b1296-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262171 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498044ba-75d4-40c6-a7df-3846dc5b82d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262201 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdt6\" (UniqueName: \"kubernetes.io/projected/db163501-b4f0-48b4-a558-e7f3d9d1c835-kube-api-access-4wdt6\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262224 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-serving-cert\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5vr\" (UniqueName: 
\"kubernetes.io/projected/5891e675-9ac7-4773-b965-63556edc89b3-kube-api-access-2s5vr\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262261 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262278 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/860c757e-a748-4689-aa5a-3414957b1d43-proxy-tls\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262293 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938747a0-8051-4751-99a2-7b3167b23975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262311 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262330 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-images\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-etcd-serving-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-image-import-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " 
pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262393 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5861cc2a-875b-459a-8940-9b16b85668a7-signing-cabundle\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262396 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262439 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262408 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262528 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262578 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-images\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262622 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-tmpfs\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262651 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-node-pullsecrets\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-ca\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262692 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-proxy-tls\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.262919 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-node-pullsecrets\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.263119 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-service-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.263597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-ca\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.263629 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q28kf"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.263835 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-service-ca\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.264163 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9ncj"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.264183 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a499f8e-4807-4432-810e-b240eae2b261-config\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.264987 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.265273 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.265676 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.266644 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-dir\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.267403 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.267466 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.267864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.268209 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.267685 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.268689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec569cec-4aef-4da7-8c8b-9a14f5565471-serving-cert\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.268766 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-config\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.269043 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a499f8e-4807-4432-810e-b240eae2b261-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.269613 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.269615 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.270180 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.270229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.270677 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-etcd-client\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.270839 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-config\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.271273 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8wfx"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.271307 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.271318 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-zdz9f"] Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.271579 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-auth-proxy-config\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.271912 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272218 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d131a4aa-4055-4aa9-bfef-4354654e6577-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272221 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272317 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272449 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272801 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cb2fd4-636c-43cc-8c36-50cffb650f27-config\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272843 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.272914 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/daa05454-65ea-4796-a2eb-79d127178570-audit-dir\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273044 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-config\") pod 
\"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273037 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/926a87b0-d967-45aa-8aff-13dcbf2c98e1-serving-cert\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273516 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-serving-cert\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273524 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273547 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.273992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-audit-policies\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.274253 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.274281 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.274549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.274730 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.274799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17e82ce0-83fd-416a-aab4-4a20524f89d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275179 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec569cec-4aef-4da7-8c8b-9a14f5565471-trusted-ca\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-audit\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275590 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db163501-b4f0-48b4-a558-e7f3d9d1c835-config\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275834 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/5a21c48a-596a-409d-8021-1425828a8a76-images\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.275920 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-image-import-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276047 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-serving-cert\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276112 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-etcd-client\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276236 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276269 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276439 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-encryption-config\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276760 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/926a87b0-d967-45aa-8aff-13dcbf2c98e1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276964 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.276969 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/daa05454-65ea-4796-a2eb-79d127178570-etcd-serving-ca\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.277355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-encryption-config\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:26 crc 
kubenswrapper[4810]: I1003 06:58:26.278187 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/db163501-b4f0-48b4-a558-e7f3d9d1c835-machine-approver-tls\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.278376 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa05454-65ea-4796-a2eb-79d127178570-serving-cert\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.278909 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.279299 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17e82ce0-83fd-416a-aab4-4a20524f89d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.279632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a21c48a-596a-409d-8021-1425828a8a76-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.279849 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.280046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10cb2fd4-636c-43cc-8c36-50cffb650f27-etcd-client\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.284341 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.297119 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 
06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.304333 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.324344 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.334959 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.343828 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.354552 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/860c757e-a748-4689-aa5a-3414957b1d43-proxy-tls\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363327 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938747a0-8051-4751-99a2-7b3167b23975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363459 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5vr\" (UniqueName: \"kubernetes.io/projected/5891e675-9ac7-4773-b965-63556edc89b3-kube-api-access-2s5vr\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5861cc2a-875b-459a-8940-9b16b85668a7-signing-cabundle\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363513 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363532 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-images\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363688 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-tmpfs\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-proxy-tls\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363736 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363768 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dx27\" (UniqueName: \"kubernetes.io/projected/dade3b21-b4f5-4558-bc6f-f63ef32cde34-kube-api-access-7dx27\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363833 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a486ed64-5363-4a9a-95e1-d83d58920673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.363950 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: 
\"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364066 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5861cc2a-875b-459a-8940-9b16b85668a7-signing-key\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364162 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a486ed64-5363-4a9a-95e1-d83d58920673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364223 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364252 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dade3b21-b4f5-4558-bc6f-f63ef32cde34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/860c757e-a748-4689-aa5a-3414957b1d43-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364340 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgm5\" (UniqueName: \"kubernetes.io/projected/d564c64d-4755-40aa-967d-8fb49704ef10-kube-api-access-6kgm5\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364407 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c30646-f384-458a-93b3-522895639fad-serving-cert\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364296 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-tmpfs\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364436 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhtq\" (UniqueName: \"kubernetes.io/projected/5861cc2a-875b-459a-8940-9b16b85668a7-kube-api-access-nxhtq\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364674 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364725 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364751 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9t58\" (UniqueName: \"kubernetes.io/projected/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-kube-api-access-k9t58\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364775 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938747a0-8051-4751-99a2-7b3167b23975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364800 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmhm\" (UniqueName: \"kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364838 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv5lk\" (UniqueName: \"kubernetes.io/projected/938747a0-8051-4751-99a2-7b3167b23975-kube-api-access-gv5lk\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364866 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxp69\" (UniqueName: \"kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364777 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364926 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w257j\" (UniqueName: \"kubernetes.io/projected/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-kube-api-access-w257j\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.364991 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6grl\" (UniqueName: \"kubernetes.io/projected/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-kube-api-access-d6grl\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365027 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365053 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a486ed64-5363-4a9a-95e1-d83d58920673-config\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-default-certificate\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365109 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk2px\" (UniqueName: \"kubernetes.io/projected/46c30646-f384-458a-93b3-522895639fad-kube-api-access-zk2px\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365149 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365211 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c30646-f384-458a-93b3-522895639fad-config\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365287 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x5vg\" (UniqueName: \"kubernetes.io/projected/860c757e-a748-4689-aa5a-3414957b1d43-kube-api-access-5x5vg\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365314 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/860c757e-a748-4689-aa5a-3414957b1d43-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.365786 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.385245 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.404701 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.417643 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/498044ba-75d4-40c6-a7df-3846dc5b82d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: 
\"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.425267 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.445122 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.450396 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba503e7e-7710-4be5-a871-ff39ff8b1296-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.464928 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.485125 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.496467 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba503e7e-7710-4be5-a871-ff39ff8b1296-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.503959 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.509776 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/498044ba-75d4-40c6-a7df-3846dc5b82d0-config\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.545154 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.557654 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/860c757e-a748-4689-aa5a-3414957b1d43-proxy-tls\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.566633 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.584320 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.594525 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-images\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.605396 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.626954 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.639344 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-proxy-tls\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.645973 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.665667 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.684744 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.704616 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.725224 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.744581 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.765990 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.786029 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.804011 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.824551 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.844546 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.859447 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5861cc2a-875b-459a-8940-9b16b85668a7-signing-key\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.865830 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.875603 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5861cc2a-875b-459a-8940-9b16b85668a7-signing-cabundle\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.887099 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.905479 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.925550 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.932482 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/dade3b21-b4f5-4558-bc6f-f63ef32cde34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.945949 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.965274 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.985309 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 06:58:26 crc kubenswrapper[4810]: I1003 06:58:26.999788 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c30646-f384-458a-93b3-522895639fad-serving-cert\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.005222 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.025024 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.026752 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c30646-f384-458a-93b3-522895639fad-config\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.044765 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: 
I1003 06:58:27.050022 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.064402 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.085586 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.107020 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.120317 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/938747a0-8051-4751-99a2-7b3167b23975-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.125168 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.142660 4810 request.go:700] Waited for 1.00228419s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0 Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.145758 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.154524 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938747a0-8051-4751-99a2-7b3167b23975-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.165163 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.185512 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.199751 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a486ed64-5363-4a9a-95e1-d83d58920673-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 
03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.206409 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.227389 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.245579 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.258279 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a486ed64-5363-4a9a-95e1-d83d58920673-config\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.265245 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.285723 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.304975 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.326189 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.345615 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.360953 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-default-certificate\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.363807 4810 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.363848 4810 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.363956 4810 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364002 4810 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364064 4810 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 03 
06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.363967 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume podName:5891e675-9ac7-4773-b965-63556edc89b3 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.863935739 +0000 UTC m=+141.291186484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume") pod "dns-default-f9ncj" (UID: "5891e675-9ac7-4773-b965-63556edc89b3") : failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364135 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls podName:5891e675-9ac7-4773-b965-63556edc89b3 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864100704 +0000 UTC m=+141.291351479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls") pod "dns-default-f9ncj" (UID: "5891e675-9ac7-4773-b965-63556edc89b3") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364162 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs podName:0710cdc3-8aa1-4d3b-8ab2-1ff402b20941 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864148445 +0000 UTC m=+141.291399210 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs") pod "router-default-5444994796-xt6fn" (UID: "0710cdc3-8aa1-4d3b-8ab2-1ff402b20941") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364184 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle podName:0710cdc3-8aa1-4d3b-8ab2-1ff402b20941 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864173676 +0000 UTC m=+141.291424441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle") pod "router-default-5444994796-xt6fn" (UID: "0710cdc3-8aa1-4d3b-8ab2-1ff402b20941") : failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364206 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert podName:b14e69b2-09a9-4af8-b903-6b3aeb219cd8 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864194736 +0000 UTC m=+141.291445511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert") pod "packageserver-d55dfcdfc-hk6wz" (UID: "b14e69b2-09a9-4af8-b903-6b3aeb219cd8") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364247 4810 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364302 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics podName:eb0f3787-ac23-4d02-8f2d-54c41686d0ed nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864288219 +0000 UTC m=+141.291538984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics") pod "marketplace-operator-79b997595-vk2ql" (UID: "eb0f3787-ac23-4d02-8f2d-54c41686d0ed") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364542 4810 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.364624 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca podName:eb0f3787-ac23-4d02-8f2d-54c41686d0ed nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.864594357 +0000 UTC m=+141.291845282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca") pod "marketplace-operator-79b997595-vk2ql" (UID: "eb0f3787-ac23-4d02-8f2d-54c41686d0ed") : failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365403 4810 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365476 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume podName:8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.865463841 +0000 UTC m=+141.292714576 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume") pod "collect-profiles-29324565-s84mz" (UID: "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a") : failed to sync configmap cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365501 4810 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365537 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls podName:d564c64d-4755-40aa-967d-8fb49704ef10 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.865528763 +0000 UTC m=+141.292779728 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-jdh7s" (UID: "d564c64d-4755-40aa-967d-8fb49704ef10") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365574 4810 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365610 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert podName:b14e69b2-09a9-4af8-b903-6b3aeb219cd8 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.865601745 +0000 UTC m=+141.292852670 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert") pod "packageserver-d55dfcdfc-hk6wz" (UID: "b14e69b2-09a9-4af8-b903-6b3aeb219cd8") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365642 4810 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: E1003 06:58:27.365671 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth podName:0710cdc3-8aa1-4d3b-8ab2-1ff402b20941 nodeName:}" failed. No retries permitted until 2025-10-03 06:58:27.865664137 +0000 UTC m=+141.292915122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth") pod "router-default-5444994796-xt6fn" (UID: "0710cdc3-8aa1-4d3b-8ab2-1ff402b20941") : failed to sync secret cache: timed out waiting for the condition Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.368276 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.384964 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.407474 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.425953 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.444980 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.466062 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.485631 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.506156 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.524877 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.545317 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.564991 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.584774 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.605049 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.625545 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.644791 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.674684 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.684466 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.705943 4810 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.725805 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.769740 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trq2f\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-kube-api-access-trq2f\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.781853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhvcj\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-kube-api-access-xhvcj\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.804727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f02cd7f0-75f4-4c17-8726-7a592a95d6f9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8mqfc\" (UID: \"f02cd7f0-75f4-4c17-8726-7a592a95d6f9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.831381 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2031c6f-7934-4ab7-aefb-6c11c8fcb48f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-xqmbj\" (UID: \"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.847148 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bc7f\" (UniqueName: \"kubernetes.io/projected/3fdb5ac5-9260-4b3a-bbd6-30017ccfc191-kube-api-access-7bc7f\") pod \"openshift-config-operator-7777fb866f-vqfp6\" (UID: \"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.848586 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.863649 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.864588 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.885081 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891522 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891626 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891784 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.891834 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.892598 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5891e675-9ac7-4773-b965-63556edc89b3-config-volume\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.893298 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-service-ca-bundle\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.893377 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.893663 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.894130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.894480 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.894590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.894772 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.895558 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-metrics-certs\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.896099 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.896678 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5891e675-9ac7-4773-b965-63556edc89b3-metrics-tls\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " 
pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.898014 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.898144 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-apiservice-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.899782 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-webhook-cert\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.901414 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d564c64d-4755-40aa-967d-8fb49704ef10-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.902172 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-stats-auth\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.925553 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.944493 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.965074 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 06:58:27 crc kubenswrapper[4810]: I1003 06:58:27.986467 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.005507 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.025646 4810 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.044979 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.082045 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qhqct\" (UniqueName: \"kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct\") pod \"console-f9d7485db-mp8wp\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.084348 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.095912 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.101918 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krrmz\" (UniqueName: \"kubernetes.io/projected/d131a4aa-4055-4aa9-bfef-4354654e6577-kube-api-access-krrmz\") pod \"cluster-samples-operator-665b6dd947-zt5t2\" (UID: \"d131a4aa-4055-4aa9-bfef-4354654e6577\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.110526 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.124596 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26c7z\" (UniqueName: \"kubernetes.io/projected/ec569cec-4aef-4da7-8c8b-9a14f5565471-kube-api-access-26c7z\") pod \"console-operator-58897d9998-cn2tr\" (UID: \"ec569cec-4aef-4da7-8c8b-9a14f5565471\") " pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.126256 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.126415 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.147653 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczhw\" (UniqueName: \"kubernetes.io/projected/9a499f8e-4807-4432-810e-b240eae2b261-kube-api-access-hczhw\") pod \"openshift-apiserver-operator-796bbdcf4f-782lq\" (UID: \"9a499f8e-4807-4432-810e-b240eae2b261\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.163019 4810 request.go:700] Waited for 1.896142179s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.165857 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcgbk\" (UniqueName: \"kubernetes.io/projected/5a21c48a-596a-409d-8021-1425828a8a76-kube-api-access-zcgbk\") pod \"machine-api-operator-5694c8668f-tk6p6\" (UID: \"5a21c48a-596a-409d-8021-1425828a8a76\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.191375 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx62q\" (UniqueName: \"kubernetes.io/projected/3a82ff2b-e6a7-494f-a4f2-d98949f88eb8-kube-api-access-zx62q\") pod \"apiserver-7bbb656c7d-npnrn\" (UID: \"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.203962 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvbq\" (UniqueName: \"kubernetes.io/projected/17e82ce0-83fd-416a-aab4-4a20524f89d4-kube-api-access-hzvbq\") pod \"openshift-controller-manager-operator-756b6f6bc6-x6rz8\" (UID: \"17e82ce0-83fd-416a-aab4-4a20524f89d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.210045 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" event={"ID":"f02cd7f0-75f4-4c17-8726-7a592a95d6f9","Type":"ContainerStarted","Data":"d5e32fa53ad6bbc64fb19cb793ff213c79cc0ac4d49409570f161a1bef64ce23"} Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.229644 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgkx\" (UniqueName: \"kubernetes.io/projected/926a87b0-d967-45aa-8aff-13dcbf2c98e1-kube-api-access-tpgkx\") pod \"authentication-operator-69f744f599-j9hjt\" (UID: \"926a87b0-d967-45aa-8aff-13dcbf2c98e1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.241287 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.252793 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.257504 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbcp\" (UniqueName: \"kubernetes.io/projected/10cb2fd4-636c-43cc-8c36-50cffb650f27-kube-api-access-pjbcp\") pod \"etcd-operator-b45778765-qhsbc\" (UID: \"10cb2fd4-636c-43cc-8c36-50cffb650f27\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.260424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht76p\" (UniqueName: \"kubernetes.io/projected/daa05454-65ea-4796-a2eb-79d127178570-kube-api-access-ht76p\") pod \"apiserver-76f77b778f-zthm4\" (UID: \"daa05454-65ea-4796-a2eb-79d127178570\") " pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.267793 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.288577 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs7g6\" (UniqueName: \"kubernetes.io/projected/1c1c2c63-5588-4027-90a1-b78cdbe3b10b-kube-api-access-zs7g6\") pod \"downloads-7954f5f757-ng6vq\" (UID: \"1c1c2c63-5588-4027-90a1-b78cdbe3b10b\") " pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.302402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/498044ba-75d4-40c6-a7df-3846dc5b82d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hb9rt\" (UID: \"498044ba-75d4-40c6-a7df-3846dc5b82d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.309187 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.325303 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.343309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba503e7e-7710-4be5-a871-ff39ff8b1296-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-s5bl4\" (UID: \"ba503e7e-7710-4be5-a871-ff39ff8b1296\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.344977 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn7zl\" (UniqueName: \"kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl\") pod \"route-controller-manager-6576b87f9c-84l77\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.346540 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.363079 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7gcw\" (UniqueName: \"kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw\") pod \"controller-manager-879f6c89f-vjxst\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.365914 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.386280 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.405528 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.417709 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.432276 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.444554 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.448305 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbn2d\" (UniqueName: \"kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d\") pod \"oauth-openshift-558db77b4-xwhmw\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.452280 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.459305 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.469455 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdt6\" (UniqueName: \"kubernetes.io/projected/db163501-b4f0-48b4-a558-e7f3d9d1c835-kube-api-access-4wdt6\") pod \"machine-approver-56656f9798-f5ggg\" (UID: \"db163501-b4f0-48b4-a558-e7f3d9d1c835\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.479463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5vr\" (UniqueName: \"kubernetes.io/projected/5891e675-9ac7-4773-b965-63556edc89b3-kube-api-access-2s5vr\") pod \"dns-default-f9ncj\" (UID: \"5891e675-9ac7-4773-b965-63556edc89b3\") " pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.485222 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.500951 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dx27\" (UniqueName: \"kubernetes.io/projected/dade3b21-b4f5-4558-bc6f-f63ef32cde34-kube-api-access-7dx27\") pod \"multus-admission-controller-857f4d67dd-9vcdh\" (UID: \"dade3b21-b4f5-4558-bc6f-f63ef32cde34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.512393 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.522751 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a486ed64-5363-4a9a-95e1-d83d58920673-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fgh8h\" (UID: \"a486ed64-5363-4a9a-95e1-d83d58920673\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.533033 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.542759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgm5\" (UniqueName: \"kubernetes.io/projected/d564c64d-4755-40aa-967d-8fb49704ef10-kube-api-access-6kgm5\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdh7s\" (UID: \"d564c64d-4755-40aa-967d-8fb49704ef10\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.558445 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.564226 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9t58\" (UniqueName: \"kubernetes.io/projected/a4a72d0a-d5a5-4c91-bee0-66af15c456e7-kube-api-access-k9t58\") pod \"machine-config-operator-74547568cd-7v4l7\" (UID: \"a4a72d0a-d5a5-4c91-bee0-66af15c456e7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.565524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.569552 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" Oct 03 06:58:28 crc kubenswrapper[4810]: W1003 06:58:28.573553 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb163501_b4f0_48b4_a558_e7f3d9d1c835.slice/crio-8db009355b2a0d75813b549c80bfa487e2af98dafca9b7340580a663d1319570 WatchSource:0}: Error finding container 8db009355b2a0d75813b549c80bfa487e2af98dafca9b7340580a663d1319570: Status 404 returned error can't find the container with id 8db009355b2a0d75813b549c80bfa487e2af98dafca9b7340580a663d1319570 Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.585525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmhm\" (UniqueName: \"kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm\") pod \"collect-profiles-29324565-s84mz\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.591098 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6"] Oct 03 06:58:28 crc kubenswrapper[4810]: W1003 06:58:28.597803 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a499f8e_4807_4432_810e_b240eae2b261.slice/crio-cac0b4867feb9e7d987aaa4a8a926b2627593f832aaa8b7f1d89795395eeaecd WatchSource:0}: Error finding container cac0b4867feb9e7d987aaa4a8a926b2627593f832aaa8b7f1d89795395eeaecd: Status 404 returned error can't find the container with id cac0b4867feb9e7d987aaa4a8a926b2627593f832aaa8b7f1d89795395eeaecd Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.598477 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.610549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv5lk\" (UniqueName: \"kubernetes.io/projected/938747a0-8051-4751-99a2-7b3167b23975-kube-api-access-gv5lk\") pod \"kube-storage-version-migrator-operator-b67b599dd-6wn9g\" (UID: \"938747a0-8051-4751-99a2-7b3167b23975\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.624588 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w257j\" (UniqueName: \"kubernetes.io/projected/b14e69b2-09a9-4af8-b903-6b3aeb219cd8-kube-api-access-w257j\") pod \"packageserver-d55dfcdfc-hk6wz\" (UID: \"b14e69b2-09a9-4af8-b903-6b3aeb219cd8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.626633 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.642243 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.643549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6grl\" (UniqueName: \"kubernetes.io/projected/0710cdc3-8aa1-4d3b-8ab2-1ff402b20941-kube-api-access-d6grl\") pod \"router-default-5444994796-xt6fn\" (UID: \"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941\") " pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.660237 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.666547 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk2px\" (UniqueName: \"kubernetes.io/projected/46c30646-f384-458a-93b3-522895639fad-kube-api-access-zk2px\") pod \"service-ca-operator-777779d784-898hd\" (UID: \"46c30646-f384-458a-93b3-522895639fad\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.668547 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.678882 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.683632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxp69\" (UniqueName: \"kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69\") pod \"marketplace-operator-79b997595-vk2ql\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:28 crc kubenswrapper[4810]: W1003 06:58:28.683849 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5468dbc2_8210_4771_b950_cd96757c5788.slice/crio-dd3931bdb7ddb2b65487977640235541c03f0370d378428d91248eb1c11d69e2 WatchSource:0}: Error finding container dd3931bdb7ddb2b65487977640235541c03f0370d378428d91248eb1c11d69e2: Status 404 returned error can't find the container with id dd3931bdb7ddb2b65487977640235541c03f0370d378428d91248eb1c11d69e2 Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.699688 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-cn2tr"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.704560 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhtq\" (UniqueName: \"kubernetes.io/projected/5861cc2a-875b-459a-8940-9b16b85668a7-kube-api-access-nxhtq\") pod \"service-ca-9c57cc56f-kkmch\" (UID: \"5861cc2a-875b-459a-8940-9b16b85668a7\") " pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:28 crc kubenswrapper[4810]: W1003 06:58:28.717855 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec569cec_4aef_4da7_8c8b_9a14f5565471.slice/crio-0629737d1653cc8e4c5b6cc28f8ec0df06d913bb0f390a44e41fcc0c59708725 WatchSource:0}: Error finding container 0629737d1653cc8e4c5b6cc28f8ec0df06d913bb0f390a44e41fcc0c59708725: Status 404 returned error can't find the container 
with id 0629737d1653cc8e4c5b6cc28f8ec0df06d913bb0f390a44e41fcc0c59708725 Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.722019 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x5vg\" (UniqueName: \"kubernetes.io/projected/860c757e-a748-4689-aa5a-3414957b1d43-kube-api-access-5x5vg\") pod \"machine-config-controller-84d6567774-bwg2d\" (UID: \"860c757e-a748-4689-aa5a-3414957b1d43\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.759586 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.770071 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.787136 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.810633 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815513 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-srv-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815564 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-srv-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815587 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckzrw\" (UniqueName: \"kubernetes.io/projected/727e35fc-2fd0-4346-bff3-a086ef10c0c7-kube-api-access-ckzrw\") pod \"migrator-59844c95c7-dw9sp\" (UID: \"727e35fc-2fd0-4346-bff3-a086ef10c0c7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815618 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqls8\" (UniqueName: \"kubernetes.io/projected/9c54b1b0-bdda-4d3f-87ed-6a4502756708-kube-api-access-bqls8\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" 
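The burst of "failed to sync configmap cache" / "failed to sync secret cache" errors stamped 06:58:27.36 above is transient: the kubelet only refuses those mounts until the matching "Caches populated" lines appear for each namespace, and the same volumes then log "MountVolume.SetUp succeeded" once the 500ms durationBeforeRetry window expires. A minimal triage sketch in Python, assuming the journal has been exported to a plain-text file with one record per line (the default file name and the exact substrings matched are assumptions taken from this excerpt, not any kubelet interface):

#!/usr/bin/env python3
# Pair each transient "failed to sync ... cache" MountVolume error with a
# later success line for the same volume name, to confirm the failure healed.
# Assumes one journal record per line in the exported file (hypothetical path).
import re
import sys

FAILED = re.compile(r'MountVolume\.SetUp failed for volume "([^"]+)".*failed to sync \w+ cache')
SUCCEEDED = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)\\?"')

def main(path: str) -> None:
    failed, succeeded = set(), set()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            if (m := FAILED.search(line)):
                failed.add(m.group(1))
            if (m := SUCCEEDED.search(line)):
                succeeded.add(m.group(1))
    for vol in sorted(failed):
        print(f'{vol}: {"recovered" if vol in succeeded else "no success line found"}')

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "kubelet-journal.txt")

Run against the exported journal, this lists every volume named in a cache-sync failure together with whether a later success record exists, which is quicker than tracing the interleaved entries by eye.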
Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815693 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2467\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815712 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htpxs\" (UniqueName: \"kubernetes.io/projected/ed125176-2266-4792-a978-f73f044b2d83-kube-api-access-htpxs\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815747 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z88r9\" (UniqueName: \"kubernetes.io/projected/00918720-4227-4bee-932f-e4aa0d614be9-kube-api-access-z88r9\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815804 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.815853 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816117 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816191 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwdtc\" (UniqueName: \"kubernetes.io/projected/b785e264-9c98-4261-adbf-8f2182c2cbf7-kube-api-access-rwdtc\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:28 crc 
kubenswrapper[4810]: I1003 06:58:28.816225 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b785e264-9c98-4261-adbf-8f2182c2cbf7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816258 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816277 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed125176-2266-4792-a978-f73f044b2d83-metrics-tls\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816347 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.816367 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-profile-collector-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: E1003 06:58:28.820267 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.320242443 +0000 UTC m=+142.747493398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.827107 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.827187 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.837364 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.856853 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tk6p6"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.857269 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.901219 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.919465 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.919773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.919847 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.919979 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwdtc\" (UniqueName: \"kubernetes.io/projected/b785e264-9c98-4261-adbf-8f2182c2cbf7-kube-api-access-rwdtc\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920012 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58cwr\" 
(UniqueName: \"kubernetes.io/projected/30567c0a-c27b-4c90-a327-2d0a427f92aa-kube-api-access-58cwr\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920039 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b785e264-9c98-4261-adbf-8f2182c2cbf7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920089 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920138 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-mountpoint-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920182 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed125176-2266-4792-a978-f73f044b2d83-metrics-tls\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920334 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-profile-collector-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-socket-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920532 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-srv-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b29hx\" (UniqueName: \"kubernetes.io/projected/5c68d14b-8d32-49b2-b007-a65ede2c357c-kube-api-access-b29hx\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920661 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-srv-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920705 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckzrw\" (UniqueName: \"kubernetes.io/projected/727e35fc-2fd0-4346-bff3-a086ef10c0c7-kube-api-access-ckzrw\") pod \"migrator-59844c95c7-dw9sp\" (UID: \"727e35fc-2fd0-4346-bff3-a086ef10c0c7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920747 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqls8\" (UniqueName: \"kubernetes.io/projected/9c54b1b0-bdda-4d3f-87ed-6a4502756708-kube-api-access-bqls8\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: E1003 06:58:28.920787 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.420745767 +0000 UTC m=+142.847996502 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920865 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920972 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-certs\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.920994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-csi-data-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921186 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2467\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921206 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htpxs\" (UniqueName: \"kubernetes.io/projected/ed125176-2266-4792-a978-f73f044b2d83-kube-api-access-htpxs\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921224 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z88r9\" (UniqueName: \"kubernetes.io/projected/00918720-4227-4bee-932f-e4aa0d614be9-kube-api-access-z88r9\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921255 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-plugins-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921302 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qc4hz\" (UniqueName: \"kubernetes.io/projected/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-kube-api-access-qc4hz\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921396 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921466 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-registration-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-node-bootstrap-token\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921508 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30567c0a-c27b-4c90-a327-2d0a427f92aa-cert\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.921556 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.923124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.947433 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.955094 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.957244 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.957370 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b785e264-9c98-4261-adbf-8f2182c2cbf7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.957629 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.957727 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.958711 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.960140 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.970520 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ng6vq"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.970654 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.972375 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqls8\" (UniqueName: \"kubernetes.io/projected/9c54b1b0-bdda-4d3f-87ed-6a4502756708-kube-api-access-bqls8\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.972490 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/00918720-4227-4bee-932f-e4aa0d614be9-srv-cert\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.973797 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/ed125176-2266-4792-a978-f73f044b2d83-metrics-tls\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.974364 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qhsbc"] Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.975195 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-srv-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.979244 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9c54b1b0-bdda-4d3f-87ed-6a4502756708-profile-collector-cert\") pod \"catalog-operator-68c6474976-9jhmd\" (UID: \"9c54b1b0-bdda-4d3f-87ed-6a4502756708\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:28 crc kubenswrapper[4810]: I1003 06:58:28.980822 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2467\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.004334 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwdtc\" (UniqueName: \"kubernetes.io/projected/b785e264-9c98-4261-adbf-8f2182c2cbf7-kube-api-access-rwdtc\") pod \"package-server-manager-789f6589d5-9hhnx\" (UID: \"b785e264-9c98-4261-adbf-8f2182c2cbf7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-socket-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022742 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b29hx\" (UniqueName: \"kubernetes.io/projected/5c68d14b-8d32-49b2-b007-a65ede2c357c-kube-api-access-b29hx\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022781 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-certs\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022800 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-csi-data-dir\") pod 
\"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022843 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-plugins-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022863 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc4hz\" (UniqueName: \"kubernetes.io/projected/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-kube-api-access-qc4hz\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022922 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-registration-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-node-bootstrap-token\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.022964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30567c0a-c27b-4c90-a327-2d0a427f92aa-cert\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58cwr\" (UniqueName: \"kubernetes.io/projected/30567c0a-c27b-4c90-a327-2d0a427f92aa-kube-api-access-58cwr\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-mountpoint-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023056 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.023425 4810 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.523410172 +0000 UTC m=+142.950660907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023805 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-socket-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-registration-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.023976 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-csi-data-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.024008 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-plugins-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.024164 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5c68d14b-8d32-49b2-b007-a65ede2c357c-mountpoint-dir\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.034100 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-certs\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.034777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-node-bootstrap-token\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.034858 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30567c0a-c27b-4c90-a327-2d0a427f92aa-cert\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.055489 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckzrw\" (UniqueName: \"kubernetes.io/projected/727e35fc-2fd0-4346-bff3-a086ef10c0c7-kube-api-access-ckzrw\") pod \"migrator-59844c95c7-dw9sp\" (UID: \"727e35fc-2fd0-4346-bff3-a086ef10c0c7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.079543 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htpxs\" (UniqueName: \"kubernetes.io/projected/ed125176-2266-4792-a978-f73f044b2d83-kube-api-access-htpxs\") pod \"dns-operator-744455d44c-c6zfm\" (UID: \"ed125176-2266-4792-a978-f73f044b2d83\") " pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.083382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z88r9\" (UniqueName: \"kubernetes.io/projected/00918720-4227-4bee-932f-e4aa0d614be9-kube-api-access-z88r9\") pod \"olm-operator-6b444d44fb-6gzbt\" (UID: \"00918720-4227-4bee-932f-e4aa0d614be9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.088234 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.104587 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.123550 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc4hz\" (UniqueName: \"kubernetes.io/projected/30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd-kube-api-access-qc4hz\") pod \"machine-config-server-zdz9f\" (UID: \"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd\") " pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.124023 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.124702 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.624623505 +0000 UTC m=+143.051874240 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.125113 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.125476 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.625467069 +0000 UTC m=+143.052717804 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.143981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.149013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b29hx\" (UniqueName: \"kubernetes.io/projected/5c68d14b-8d32-49b2-b007-a65ede2c357c-kube-api-access-b29hx\") pod \"csi-hostpathplugin-q28kf\" (UID: \"5c68d14b-8d32-49b2-b007-a65ede2c357c\") " pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.159827 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58cwr\" (UniqueName: \"kubernetes.io/projected/30567c0a-c27b-4c90-a327-2d0a427f92aa-kube-api-access-58cwr\") pod \"ingress-canary-b8wfx\" (UID: \"30567c0a-c27b-4c90-a327-2d0a427f92aa\") " pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.180465 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.188963 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.226146 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.226373 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.726305333 +0000 UTC m=+143.153556068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.226357 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" event={"ID":"f02cd7f0-75f4-4c17-8726-7a592a95d6f9","Type":"ContainerStarted","Data":"16c1b31ae29308332df87c8d7b2edc27385ccc3b031e773122fe8b29113b8ad0"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.226616 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" event={"ID":"f02cd7f0-75f4-4c17-8726-7a592a95d6f9","Type":"ContainerStarted","Data":"b3908f9383ca63feffafc9259e5bac7736256289a7e1a2e4db041397027d8dd8"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.226730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.227529 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.727510745 +0000 UTC m=+143.154761480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.240042 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" event={"ID":"ec569cec-4aef-4da7-8c8b-9a14f5565471","Type":"ContainerStarted","Data":"ba1bfc2a8cdb96ab126d7d81bae516e800599fe4cf55c554dc044dd81416b9b2"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.240077 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" event={"ID":"ec569cec-4aef-4da7-8c8b-9a14f5565471","Type":"ContainerStarted","Data":"0629737d1653cc8e4c5b6cc28f8ec0df06d913bb0f390a44e41fcc0c59708725"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.240563 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.241702 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xt6fn" event={"ID":"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941","Type":"ContainerStarted","Data":"72686c4bf312ea3f73b928d5f725a673c8e62c83b614a495dad0b83f973702cc"} Oct 03 06:58:29 crc kubenswrapper[4810]: W1003 06:58:29.245058 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c1c2c63_5588_4027_90a1_b78cdbe3b10b.slice/crio-9d974973192e0685c76cfb3298396bc8e325815609d26af52112df0924de6ef0 WatchSource:0}: Error finding container 9d974973192e0685c76cfb3298396bc8e325815609d26af52112df0924de6ef0: Status 404 returned error can't find the container with id 9d974973192e0685c76cfb3298396bc8e325815609d26af52112df0924de6ef0 Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.253807 4810 patch_prober.go:28] interesting pod/console-operator-58897d9998-cn2tr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.253920 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" podUID="ec569cec-4aef-4da7-8c8b-9a14f5565471" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.256984 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" event={"ID":"9a499f8e-4807-4432-810e-b240eae2b261","Type":"ContainerStarted","Data":"9b327fa9fea589ad085d1dc386b87d1e3683b76c7b012e7d494607d83cf2c07f"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.257058 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" 
event={"ID":"9a499f8e-4807-4432-810e-b240eae2b261","Type":"ContainerStarted","Data":"cac0b4867feb9e7d987aaa4a8a926b2627593f832aaa8b7f1d89795395eeaecd"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.283522 4810 generic.go:334] "Generic (PLEG): container finished" podID="3fdb5ac5-9260-4b3a-bbd6-30017ccfc191" containerID="44b7f3eee6d358c4b381d4c754a96a90dbced38ff19d1a4d62e48cd312a89741" exitCode=0 Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.283985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" event={"ID":"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191","Type":"ContainerDied","Data":"44b7f3eee6d358c4b381d4c754a96a90dbced38ff19d1a4d62e48cd312a89741"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.284054 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" event={"ID":"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191","Type":"ContainerStarted","Data":"d5d1d4b72878eb1f594ccbf63a029aa853fc11a2b853ae8dba50c1b603680cc4"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.288056 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8wfx" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.313081 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.327039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" event={"ID":"db163501-b4f0-48b4-a558-e7f3d9d1c835","Type":"ContainerStarted","Data":"8db009355b2a0d75813b549c80bfa487e2af98dafca9b7340580a663d1319570"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.327312 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" event={"ID":"17e82ce0-83fd-416a-aab4-4a20524f89d4","Type":"ContainerStarted","Data":"fdd5dbf71240d700a733f40e4d8314a7db3db7418fde022054318a17760f4e8c"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.328449 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.328480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zdz9f" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.329460 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.829436039 +0000 UTC m=+143.256686774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.329558 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.331021 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.831004331 +0000 UTC m=+143.258255066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.349675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" event={"ID":"ba503e7e-7710-4be5-a871-ff39ff8b1296","Type":"ContainerStarted","Data":"7a6036038c322b03b1d0edf71446e696205856af10bdd0518b6b0f39472ba5ea"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.355421 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" event={"ID":"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8","Type":"ContainerStarted","Data":"42f93476a031451bf0356d317eca9a3fb901f3c7be68dfabc8ca2ea3b052aa9e"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.374978 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9vcdh"] Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.379780 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" event={"ID":"5a21c48a-596a-409d-8021-1425828a8a76","Type":"ContainerStarted","Data":"ca5fc0ee3f4a1c6b219fa1c45f09499dd087127fb118ba24192197a5c8db0d4f"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.388828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" event={"ID":"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f","Type":"ContainerStarted","Data":"c044a7e4b940d72fe818c80b421977609119d5a894dca8261d626e4958a1fabb"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.417768 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.432111 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mp8wp" event={"ID":"5468dbc2-8210-4771-b950-cd96757c5788","Type":"ContainerStarted","Data":"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.432158 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mp8wp" event={"ID":"5468dbc2-8210-4771-b950-cd96757c5788","Type":"ContainerStarted","Data":"dd3931bdb7ddb2b65487977640235541c03f0370d378428d91248eb1c11d69e2"} Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.434991 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.435723 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:29.935703981 +0000 UTC m=+143.362954716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.453517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt"] Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.536471 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.537572 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.037546773 +0000 UTC m=+143.464797508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.646796 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.146768176 +0000 UTC m=+143.574018941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.646957 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.647446 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.647810 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.147799435 +0000 UTC m=+143.575050170 (durationBeforeRetry 500ms). 
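By this point the same MountDevice/TearDown pair has been rescheduled several times: each failure records a "no retries permitted until" deadline 500ms in the future (durationBeforeRetry), and the reconciler skips the operation until that deadline passes, giving the CSI driver time to register. The sketch below shows the shape of that deadline-gated retry; it is a generic illustration, not kubelet source.

```go
// retry_gate.go: generic sketch of the "no retries permitted until" pattern seen
// in these entries: a failed operation records a deadline and is skipped until
// the deadline passes.
package main

import (
	"errors"
	"fmt"
	"time"
)

type pendingOp struct {
	name       string
	notBefore  time.Time     // "no retries permitted until"
	retryDelay time.Duration // durationBeforeRetry
}

func (p *pendingOp) tryRun(run func() error) {
	if time.Now().Before(p.notBefore) {
		return // still inside the backoff window; the reconciler skips the operation
	}
	if err := run(); err != nil {
		p.notBefore = time.Now().Add(p.retryDelay)
		fmt.Printf("%s failed, no retries permitted until %s: %v\n",
			p.name, p.notBefore.Format(time.RFC3339Nano), err)
	}
}

func main() {
	op := &pendingOp{name: "MountDevice pvc-657094db", retryDelay: 500 * time.Millisecond}
	driverRegistered := false // flips to true once the CSI plugin registers

	for i := 0; i < 5; i++ {
		op.tryRun(func() error {
			if !driverRegistered {
				return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
			}
			return nil
		})
		time.Sleep(200 * time.Millisecond)
		if i == 3 {
			driverRegistered = true
		}
	}
}
```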
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.753313 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.753757 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.253725777 +0000 UTC m=+143.680976512 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.754266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.754644 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.254628872 +0000 UTC m=+143.681879607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.761782 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8mqfc" podStartSLOduration=122.761746498 podStartE2EDuration="2m2.761746498s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:29.759412643 +0000 UTC m=+143.186663388" watchObservedRunningTime="2025-10-03 06:58:29.761746498 +0000 UTC m=+143.188997233" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.832269 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" podStartSLOduration=122.832185168 podStartE2EDuration="2m2.832185168s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:29.824729834 +0000 UTC m=+143.251980579" watchObservedRunningTime="2025-10-03 06:58:29.832185168 +0000 UTC m=+143.259435923" Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.856613 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.857031 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.357008968 +0000 UTC m=+143.784259703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.923621 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz"] Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.947344 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j9hjt"] Oct 03 06:58:29 crc kubenswrapper[4810]: I1003 06:58:29.958019 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:29 crc kubenswrapper[4810]: E1003 06:58:29.958368 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.458343715 +0000 UTC m=+143.885594440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.058990 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.059152 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.559123518 +0000 UTC m=+143.986374253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.059820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.060253 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.560232088 +0000 UTC m=+143.987482823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.086936 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.111243 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.132614 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 06:58:30 crc kubenswrapper[4810]: W1003 06:58:30.158608 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14e69b2_09a9_4af8_b903_6b3aeb219cd8.slice/crio-9b4b47c7747539e3fa6b08f7f08ce6c118c8cdff4148bca09f11d76492a6a11e WatchSource:0}: Error finding container 9b4b47c7747539e3fa6b08f7f08ce6c118c8cdff4148bca09f11d76492a6a11e: Status 404 returned error can't find the container with id 9b4b47c7747539e3fa6b08f7f08ce6c118c8cdff4148bca09f11d76492a6a11e Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.170517 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.171131 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 06:58:30.671108657 +0000 UTC m=+144.098359392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.188937 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zthm4"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.287258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.287699 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.787681073 +0000 UTC m=+144.214931808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.378524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.385358 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f9ncj"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.390317 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.390607 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.890572512 +0000 UTC m=+144.317823247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.394497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.394548 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d"] Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.395051 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.895026284 +0000 UTC m=+144.322277019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.410340 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mp8wp" podStartSLOduration=123.410305953 podStartE2EDuration="2m3.410305953s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.399680542 +0000 UTC m=+143.826931297" watchObservedRunningTime="2025-10-03 06:58:30.410305953 +0000 UTC m=+143.837556698" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.438822 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.439844 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kkmch"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.441871 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.456056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" event={"ID":"b14e69b2-09a9-4af8-b903-6b3aeb219cd8","Type":"ContainerStarted","Data":"9b4b47c7747539e3fa6b08f7f08ce6c118c8cdff4148bca09f11d76492a6a11e"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.464870 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" event={"ID":"5a21c48a-596a-409d-8021-1425828a8a76","Type":"ContainerStarted","Data":"e5562c40eccf916179a067e707e83568cefc912272c915cba6d8acec49319221"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.493578 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.496930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zdz9f" event={"ID":"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd","Type":"ContainerStarted","Data":"175e332c2e4143ccb29a3f1ab121c76dbbd8933cf5e73035d4a68c3b1e6af547"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.496993 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zdz9f" event={"ID":"30fffcf6-c1f8-4fb4-9689-05fb42aaf2dd","Type":"ContainerStarted","Data":"c405858200a3368be3f20b3797ca4140b22c494a2f185cea78bb78fca0d78c08"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.498340 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.498962 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g"] Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.499253 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:30.99921655 +0000 UTC m=+144.426467285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.510270 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd"] Oct 03 06:58:30 crc kubenswrapper[4810]: W1003 06:58:30.512149 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd564c64d_4755_40aa_967d_8fb49704ef10.slice/crio-844d21caf6aa7f16683aab94e856c61404d480f416dfb65932d2f6c20dff4063 WatchSource:0}: Error finding container 844d21caf6aa7f16683aab94e856c61404d480f416dfb65932d2f6c20dff4063: Status 404 returned error can't find the container with id 844d21caf6aa7f16683aab94e856c61404d480f416dfb65932d2f6c20dff4063 Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.514725 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-898hd"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.545449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-q28kf"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.598405 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" event={"ID":"db163501-b4f0-48b4-a558-e7f3d9d1c835","Type":"ContainerStarted","Data":"9139dc89bfa74882870bc0a12debc139f8946084d45006ba4d221646c52c4ac5"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.602833 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.604573 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.606129 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.1061138 +0000 UTC m=+144.533364535 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.615173 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8wfx"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.631024 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" event={"ID":"498044ba-75d4-40c6-a7df-3846dc5b82d0","Type":"ContainerStarted","Data":"3531fa1ac46b9b86aecba6ef7025e6bbd078fe329e005731a9ee691836268ccd"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.637640 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-782lq" podStartSLOduration=123.637617724 podStartE2EDuration="2m3.637617724s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.636715658 +0000 UTC m=+144.063966393" watchObservedRunningTime="2025-10-03 06:58:30.637617724 +0000 UTC m=+144.064868459" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.656901 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.656971 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.657851 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" event={"ID":"1023edcf-7aaf-4340-96dd-af9d1fb114ea","Type":"ContainerStarted","Data":"511213e4c58353bb7a259cf842261c32e5783db78ef8fa274a2b387baf950916"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.676334 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" event={"ID":"ba503e7e-7710-4be5-a871-ff39ff8b1296","Type":"ContainerStarted","Data":"bd64d59509c1a3720fbc07c20b42333ebfe35c06a9a581ec5c9338c144910e9a"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.691064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" event={"ID":"5bed5787-1f8e-4259-9cb9-c6edb845df27","Type":"ContainerStarted","Data":"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.691110 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" event={"ID":"5bed5787-1f8e-4259-9cb9-c6edb845df27","Type":"ContainerStarted","Data":"def23873d41b215d542f59424bb31945caf6c1fbff12b6157e99b9ee3ce504a7"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.691735 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.695390 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c6zfm"] Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.710957 4810 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-84l77 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.711041 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.712034 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.712443 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.212425283 +0000 UTC m=+144.639676018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.718239 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xt6fn" event={"ID":"0710cdc3-8aa1-4d3b-8ab2-1ff402b20941","Type":"ContainerStarted","Data":"8dd9811ad9a018a5954b4d99e88ad4e6e43b9ac28200236667ee5073c0109a45"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.753470 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" event={"ID":"dade3b21-b4f5-4558-bc6f-f63ef32cde34","Type":"ContainerStarted","Data":"ca3120a856d9e46ac0a1bcd95a679846c840fcb0c2c66ade8218e1de67b7a77d"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.762858 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" event={"ID":"926a87b0-d967-45aa-8aff-13dcbf2c98e1","Type":"ContainerStarted","Data":"924fb1fc2a240c23ee06b976c27664fa0a805343c97de25549f650bf4d394b38"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.823463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.825798 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.32578077 +0000 UTC m=+144.753031505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.830466 4810 generic.go:334] "Generic (PLEG): container finished" podID="3a82ff2b-e6a7-494f-a4f2-d98949f88eb8" containerID="2521d5a86cfe08a084d180e13a67ea19bca54a59bef3fefce2def4af11d0d3b4" exitCode=0 Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.830503 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" event={"ID":"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8","Type":"ContainerDied","Data":"2521d5a86cfe08a084d180e13a67ea19bca54a59bef3fefce2def4af11d0d3b4"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.871611 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-s5bl4" podStartSLOduration=123.871584426 podStartE2EDuration="2m3.871584426s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.865317604 +0000 UTC m=+144.292568339" watchObservedRunningTime="2025-10-03 06:58:30.871584426 +0000 UTC m=+144.298835161" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.878311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" event={"ID":"f2031c6f-7934-4ab7-aefb-6c11c8fcb48f","Type":"ContainerStarted","Data":"5aa3f8d4144498ffac9d4e2a869636e5a96c5b005244ad0b2e3971faca99cd64"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.905856 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.920099 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:30 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:30 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:30 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.920426 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.926426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.929460 4810 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" podStartSLOduration=123.929433171 podStartE2EDuration="2m3.929433171s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.921398391 +0000 UTC m=+144.348649126" watchObservedRunningTime="2025-10-03 06:58:30.929433171 +0000 UTC m=+144.356683896" Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.942458 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" event={"ID":"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a","Type":"ContainerStarted","Data":"dc8bd39318b647148d2e2fb32ee4518c286d6239d8f1b1955296538a0b908b44"} Oct 03 06:58:30 crc kubenswrapper[4810]: I1003 06:58:30.955310 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xt6fn" podStartSLOduration=123.955092214 podStartE2EDuration="2m3.955092214s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.955234298 +0000 UTC m=+144.382485043" watchObservedRunningTime="2025-10-03 06:58:30.955092214 +0000 UTC m=+144.382342949" Oct 03 06:58:30 crc kubenswrapper[4810]: E1003 06:58:30.956288 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.455337621 +0000 UTC m=+144.882588356 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.000015 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ng6vq" event={"ID":"1c1c2c63-5588-4027-90a1-b78cdbe3b10b","Type":"ContainerStarted","Data":"bd92a3916a440e5c655e1afac796466e653b50e47ba8f02ccad32f5de16f2d73"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.000086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ng6vq" event={"ID":"1c1c2c63-5588-4027-90a1-b78cdbe3b10b","Type":"ContainerStarted","Data":"9d974973192e0685c76cfb3298396bc8e325815609d26af52112df0924de6ef0"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.001932 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.027276 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" event={"ID":"daa05454-65ea-4796-a2eb-79d127178570","Type":"ContainerStarted","Data":"f14c0d6b20e11a3465c5ee244c898c7fd943215b123697de90461dffa911d333"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.028257 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-ng6vq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.028326 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ng6vq" podUID="1c1c2c63-5588-4027-90a1-b78cdbe3b10b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.035859 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.038238 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.538206433 +0000 UTC m=+144.965457168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.044065 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zdz9f" podStartSLOduration=5.044043053 podStartE2EDuration="5.044043053s" podCreationTimestamp="2025-10-03 06:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:30.983864383 +0000 UTC m=+144.411115118" watchObservedRunningTime="2025-10-03 06:58:31.044043053 +0000 UTC m=+144.471293788" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.044330 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" podStartSLOduration=123.04432465 podStartE2EDuration="2m3.04432465s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.042611523 +0000 UTC m=+144.469862258" watchObservedRunningTime="2025-10-03 06:58:31.04432465 +0000 UTC m=+144.471575385" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.069094 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" event={"ID":"17e82ce0-83fd-416a-aab4-4a20524f89d4","Type":"ContainerStarted","Data":"350d4f7f069e71e47beaee782fe17b2f87b6f48d162d3aca463b67c108e92e70"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.095743 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" event={"ID":"3fdb5ac5-9260-4b3a-bbd6-30017ccfc191","Type":"ContainerStarted","Data":"cf1659d3992ed7da5b9df1a8b933c671d9b6d8d002125af9bf4276925fe84c5b"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.121062 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.131376 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" event={"ID":"d131a4aa-4055-4aa9-bfef-4354654e6577","Type":"ContainerStarted","Data":"035922dd826e0c02bf4eaedc785e26306e3df7507f90a36d5389ebd7e40bfb33"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.131425 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" event={"ID":"d131a4aa-4055-4aa9-bfef-4354654e6577","Type":"ContainerStarted","Data":"009e4702f2ec77011db2258d9207f164c63fe88fb716d494a1c552d8576c5625"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.139882 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.141496 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.641450312 +0000 UTC m=+145.068701047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.155809 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" podStartSLOduration=124.155787145 podStartE2EDuration="2m4.155787145s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.080548113 +0000 UTC m=+144.507798848" watchObservedRunningTime="2025-10-03 06:58:31.155787145 +0000 UTC m=+144.583037880" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.187862 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" event={"ID":"bf5629a2-a195-41d3-a776-551b79952630","Type":"ContainerStarted","Data":"bf1fcbc36fd3b09bbde16cd812b032d50afdf0efd9cbc90866291586fc8eeed7"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.188422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.217229 4810 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xwhmw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.217325 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.219643 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-xqmbj" podStartSLOduration=124.219624494 podStartE2EDuration="2m4.219624494s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.21545477 +0000 UTC m=+144.642705505" watchObservedRunningTime="2025-10-03 06:58:31.219624494 +0000 UTC m=+144.646875229" Oct 03 06:58:31 crc 
kubenswrapper[4810]: I1003 06:58:31.254323 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.264172 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.764155955 +0000 UTC m=+145.191406690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.264456 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" event={"ID":"10cb2fd4-636c-43cc-8c36-50cffb650f27","Type":"ContainerStarted","Data":"e77869a3cd537d6f68ae21fefeac2b68942b8a972d0d38751c54c982b1b86813"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.264488 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" event={"ID":"10cb2fd4-636c-43cc-8c36-50cffb650f27","Type":"ContainerStarted","Data":"758ec589c17dd27d2fb6578ae3973cbf4a5254a26714f125ffc746aa40324c61"} Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.299176 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-cn2tr" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.299424 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ng6vq" podStartSLOduration=124.299413911 podStartE2EDuration="2m4.299413911s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.296925813 +0000 UTC m=+144.724176558" watchObservedRunningTime="2025-10-03 06:58:31.299413911 +0000 UTC m=+144.726664636" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.300807 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-x6rz8" podStartSLOduration=124.300800539 podStartE2EDuration="2m4.300800539s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.264374181 +0000 UTC m=+144.691624916" watchObservedRunningTime="2025-10-03 06:58:31.300800539 +0000 UTC m=+144.728051274" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.357679 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.358041 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.858006197 +0000 UTC m=+145.285256922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.358550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.366824 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.866805538 +0000 UTC m=+145.294056273 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.418305 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" podStartSLOduration=124.418285219 podStartE2EDuration="2m4.418285219s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.389043788 +0000 UTC m=+144.816294523" watchObservedRunningTime="2025-10-03 06:58:31.418285219 +0000 UTC m=+144.845535954" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.459815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.460280 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:31.96025983 +0000 UTC m=+145.387510565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.568179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.568871 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.068854007 +0000 UTC m=+145.496104742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.670858 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.674600 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.174571214 +0000 UTC m=+145.601821949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.676233 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.676585 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.176572499 +0000 UTC m=+145.603823234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.779012 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.779533 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.27950694 +0000 UTC m=+145.706757675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.862720 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qhsbc" podStartSLOduration=124.86270174 podStartE2EDuration="2m4.86270174s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.806937932 +0000 UTC m=+145.234188667" watchObservedRunningTime="2025-10-03 06:58:31.86270174 +0000 UTC m=+145.289952475" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.881964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.882513 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.382498173 +0000 UTC m=+145.809748908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.902149 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" podStartSLOduration=124.902126771 podStartE2EDuration="2m4.902126771s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.864690034 +0000 UTC m=+145.291940769" watchObservedRunningTime="2025-10-03 06:58:31.902126771 +0000 UTC m=+145.329377506" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.913640 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:31 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:31 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:31 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.914016 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.927886 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" podStartSLOduration=124.927865356 podStartE2EDuration="2m4.927865356s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.925742958 +0000 UTC m=+145.352993693" watchObservedRunningTime="2025-10-03 06:58:31.927865356 +0000 UTC m=+145.355116091" Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.983023 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.983262 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.483217273 +0000 UTC m=+145.910468018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:31 crc kubenswrapper[4810]: I1003 06:58:31.983791 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:31 crc kubenswrapper[4810]: E1003 06:58:31.984215 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.4842071 +0000 UTC m=+145.911457835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.084620 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.085435 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.585416204 +0000 UTC m=+146.012666939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.089124 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.089176 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.186985 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.187460 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.68744517 +0000 UTC m=+146.114695905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.274552 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" event={"ID":"eb0f3787-ac23-4d02-8f2d-54c41686d0ed","Type":"ContainerStarted","Data":"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.275057 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" event={"ID":"eb0f3787-ac23-4d02-8f2d-54c41686d0ed","Type":"ContainerStarted","Data":"33253e537731c78c8973ad0300a8bb8f76e92335f2b19a3560d9126135651aba"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.276430 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.278577 4810 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vk2ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.278623 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.292920 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.293394 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.793376714 +0000 UTC m=+146.220627439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.297124 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" event={"ID":"1023edcf-7aaf-4340-96dd-af9d1fb114ea","Type":"ContainerStarted","Data":"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.298320 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.304140 4810 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-vjxst container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.304177 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.325238 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hb9rt" event={"ID":"498044ba-75d4-40c6-a7df-3846dc5b82d0","Type":"ContainerStarted","Data":"30a1b2d5ee4e0e3b22376eb56a9572a65ec62fb82a81dc58d957fd3e1ef62e7c"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.336572 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9ncj" event={"ID":"5891e675-9ac7-4773-b965-63556edc89b3","Type":"ContainerStarted","Data":"8e2692ce5df9b592c4c2eba177235380e42aa590bc09c5e7ac1c4a3f85cf5427"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.336624 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9ncj" event={"ID":"5891e675-9ac7-4773-b965-63556edc89b3","Type":"ContainerStarted","Data":"b324ce83303b89838d0b5e32e81c6732417215479f8ea6297efdea2366532b99"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.355662 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" podStartSLOduration=125.355645541 podStartE2EDuration="2m5.355645541s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:31.957460347 +0000 UTC m=+145.384711082" watchObservedRunningTime="2025-10-03 06:58:32.355645541 +0000 UTC m=+145.782896276" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.357120 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" podStartSLOduration=124.357115161 
podStartE2EDuration="2m4.357115161s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.353465911 +0000 UTC m=+145.780716646" watchObservedRunningTime="2025-10-03 06:58:32.357115161 +0000 UTC m=+145.784365896" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.358556 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" event={"ID":"d564c64d-4755-40aa-967d-8fb49704ef10","Type":"ContainerStarted","Data":"7be254d4b0e3286cba1883516ac8c1206148a869ab031d82c5b516343fc9fc14"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.358590 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" event={"ID":"d564c64d-4755-40aa-967d-8fb49704ef10","Type":"ContainerStarted","Data":"844d21caf6aa7f16683aab94e856c61404d480f416dfb65932d2f6c20dff4063"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.379367 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" event={"ID":"860c757e-a748-4689-aa5a-3414957b1d43","Type":"ContainerStarted","Data":"a91850045f25f0214e53cf609585b239ae411806f5feb1fd8ca89c8ccd950510"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.379431 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" event={"ID":"860c757e-a748-4689-aa5a-3414957b1d43","Type":"ContainerStarted","Data":"867a944ad599883018a76ce398fc92d9874e1626bbe22a393f4b92a42824eb12"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.379442 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" event={"ID":"860c757e-a748-4689-aa5a-3414957b1d43","Type":"ContainerStarted","Data":"c1df16e3f390ff266bb7dc361f4d7fcb87eae60b32599ddc574eaa4dd5a96dcf"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.393135 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" podStartSLOduration=125.393116657 podStartE2EDuration="2m5.393116657s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.391419291 +0000 UTC m=+145.818670026" watchObservedRunningTime="2025-10-03 06:58:32.393116657 +0000 UTC m=+145.820367392" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.395553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.396544 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:32.896531481 +0000 UTC m=+146.323782216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.402220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" event={"ID":"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a","Type":"ContainerStarted","Data":"7aab4ae837560cb6087d7b83ac07a2cee7a88c4fe20105d09db697be89895c6e"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.410258 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" event={"ID":"a486ed64-5363-4a9a-95e1-d83d58920673","Type":"ContainerStarted","Data":"2e952c74903802d45654080eaebd120eb0be5051709770dc938331909072ce91"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.410305 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" event={"ID":"a486ed64-5363-4a9a-95e1-d83d58920673","Type":"ContainerStarted","Data":"ca811b2ff549614451c2e4740cdf79b8d3f426fff55cdf7d28785a9af7309257"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.444560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j9hjt" event={"ID":"926a87b0-d967-45aa-8aff-13dcbf2c98e1","Type":"ContainerStarted","Data":"834dd4e22f29b5c2ea06c003eaac212884d0133b7f7d582c36920e9dd8c9de86"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.486265 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" event={"ID":"00918720-4227-4bee-932f-e4aa0d614be9","Type":"ContainerStarted","Data":"bcb5f4d5b8b392d355d116982bfd89bdeb5dcdf49bc2d70d69f2ae7b3833799a"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.486317 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" event={"ID":"00918720-4227-4bee-932f-e4aa0d614be9","Type":"ContainerStarted","Data":"cb4fc69ebe31256bdfa44686762d5c351b81140d791d33660179c5ed84d61b70"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.486611 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.504928 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bwg2d" podStartSLOduration=124.504902641 podStartE2EDuration="2m4.504902641s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.459512467 +0000 UTC m=+145.886763202" watchObservedRunningTime="2025-10-03 06:58:32.504902641 +0000 UTC m=+145.932153376" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.511339 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.511506 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.011479092 +0000 UTC m=+146.438729827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.511723 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.514987 4810 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-6gzbt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.515045 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" podUID="00918720-4227-4bee-932f-e4aa0d614be9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.523941 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.023879192 +0000 UTC m=+146.451129927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.553027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" event={"ID":"b785e264-9c98-4261-adbf-8f2182c2cbf7","Type":"ContainerStarted","Data":"dc039d1069e8e952e6c9cad8c0c86a3f09c329a60e6bd56c2fcdb844e9d2d568"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.553079 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" event={"ID":"b785e264-9c98-4261-adbf-8f2182c2cbf7","Type":"ContainerStarted","Data":"a7051263e7c6b7eaba637af7b193d62b73e0c5b67db1115bf9c675ebf404c5a9"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.553093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" event={"ID":"b785e264-9c98-4261-adbf-8f2182c2cbf7","Type":"ContainerStarted","Data":"87553ec2f6a3517a472d4168e320dddc76cfddf15118b22129d97b8f7b435be1"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.553220 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.571753 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdh7s" podStartSLOduration=124.571727393 podStartE2EDuration="2m4.571727393s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.506643449 +0000 UTC m=+145.933894184" watchObservedRunningTime="2025-10-03 06:58:32.571727393 +0000 UTC m=+145.998978128" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.587599 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" event={"ID":"5a21c48a-596a-409d-8021-1425828a8a76","Type":"ContainerStarted","Data":"40f088cbb8f4d7d3aa865e1aa547f8bcaff14900fca3632289b9944db9da14ca"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.596097 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" event={"ID":"bf5629a2-a195-41d3-a776-551b79952630","Type":"ContainerStarted","Data":"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.598246 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" event={"ID":"938747a0-8051-4751-99a2-7b3167b23975","Type":"ContainerStarted","Data":"8c9685212ecbb287a4d6c756b2b063f615475f5cdf51f8484b41cd6bf9c895c1"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.598272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" event={"ID":"938747a0-8051-4751-99a2-7b3167b23975","Type":"ContainerStarted","Data":"5f9f290f70abcdf6f7b57af9e44c2d559e6ceed99a9be7de2d685d4ee3baafb6"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.599776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" event={"ID":"a4a72d0a-d5a5-4c91-bee0-66af15c456e7","Type":"ContainerStarted","Data":"1dad96ef60ef7ad2b6f5223a632ae430c2605a9c836516a934de074d7358c510"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.599800 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" event={"ID":"a4a72d0a-d5a5-4c91-bee0-66af15c456e7","Type":"ContainerStarted","Data":"697faf212502ad26c054c9367dc475df4db0e599c51240f6e2da3d18a790b6af"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.599810 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" event={"ID":"a4a72d0a-d5a5-4c91-bee0-66af15c456e7","Type":"ContainerStarted","Data":"4f57634fe479bec514e7b0399194041d24000a1a6b2f9cea7543c05ba3f71506"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.600786 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" event={"ID":"5c68d14b-8d32-49b2-b007-a65ede2c357c","Type":"ContainerStarted","Data":"c8e260011bff8c30b1efd9e928c9b1e53e33d1d4e6f1c27cf05f65ca9f171dd2"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.602941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zt5t2" event={"ID":"d131a4aa-4055-4aa9-bfef-4354654e6577","Type":"ContainerStarted","Data":"a555b0d44a019748398b322739886d3b5904eb4e6ab43bd41df768b6d29464bf"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.615127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.616339 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.116319215 +0000 UTC m=+146.543569950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.616524 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" podStartSLOduration=124.61649686 podStartE2EDuration="2m4.61649686s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.580610136 +0000 UTC m=+146.007860871" watchObservedRunningTime="2025-10-03 06:58:32.61649686 +0000 UTC m=+146.043747595" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.668965 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" event={"ID":"db163501-b4f0-48b4-a558-e7f3d9d1c835","Type":"ContainerStarted","Data":"f8272b64a5ff7136080fa6dde0e7cf2bc5fa9c4052305e9cd259ace57f449394"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.683300 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fgh8h" podStartSLOduration=125.68326853 podStartE2EDuration="2m5.68326853s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.633851495 +0000 UTC m=+146.061102230" watchObservedRunningTime="2025-10-03 06:58:32.68326853 +0000 UTC m=+146.110519265" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.685362 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-6wn9g" podStartSLOduration=125.685347407 podStartE2EDuration="2m5.685347407s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.681541592 +0000 UTC m=+146.108792327" watchObservedRunningTime="2025-10-03 06:58:32.685347407 +0000 UTC m=+146.112598142" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.701677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8wfx" event={"ID":"30567c0a-c27b-4c90-a327-2d0a427f92aa","Type":"ContainerStarted","Data":"e54aa7f9f3efb8fc0b164bef53eb075f33011f05510b813887f7bb5c2d97f28d"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.701737 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8wfx" event={"ID":"30567c0a-c27b-4c90-a327-2d0a427f92aa","Type":"ContainerStarted","Data":"f7ab653cd0e127273771040c7e4814cc5bbdef8c215a2477fc88c937e1b89169"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.717185 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.717577 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.21755949 +0000 UTC m=+146.644810215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.739625 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" podStartSLOduration=124.739594474 podStartE2EDuration="2m4.739594474s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.723547623 +0000 UTC m=+146.150798358" watchObservedRunningTime="2025-10-03 06:58:32.739594474 +0000 UTC m=+146.166845209" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.749413 4810 generic.go:334] "Generic (PLEG): container finished" podID="daa05454-65ea-4796-a2eb-79d127178570" containerID="88a3097c851b61e0656fb9ebde163bb5874795f01f694af1260ff53e91ca4f74" exitCode=0 Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.749549 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" event={"ID":"daa05454-65ea-4796-a2eb-79d127178570","Type":"ContainerDied","Data":"88a3097c851b61e0656fb9ebde163bb5874795f01f694af1260ff53e91ca4f74"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.786582 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" event={"ID":"5861cc2a-875b-459a-8940-9b16b85668a7","Type":"ContainerStarted","Data":"b1b4de343b54fd97548c4ec76b8abee505f0e184cfebee5c3ea0a07d0bc95395"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.786638 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" event={"ID":"5861cc2a-875b-459a-8940-9b16b85668a7","Type":"ContainerStarted","Data":"6a9dcc7fceb98fb14a96dede922261bcbcd32cd100847ae1e7e928c82834d537"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.819540 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.821175 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.321146339 +0000 UTC m=+146.748397074 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.829053 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" event={"ID":"dade3b21-b4f5-4558-bc6f-f63ef32cde34","Type":"ContainerStarted","Data":"6c86fb57c683fc04da8695a7cc10040d31651ac54d5c76ab6778431628d25c37"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.829112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" event={"ID":"dade3b21-b4f5-4558-bc6f-f63ef32cde34","Type":"ContainerStarted","Data":"d3914ad9d4866a1597af302b767b591765e10cd1c65aa571b512c373468bd39c"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.832595 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tk6p6" podStartSLOduration=124.832563941 podStartE2EDuration="2m4.832563941s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.771374374 +0000 UTC m=+146.198625099" watchObservedRunningTime="2025-10-03 06:58:32.832563941 +0000 UTC m=+146.259814676" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.858697 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kkmch" podStartSLOduration=124.858673197 podStartE2EDuration="2m4.858673197s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.858013269 +0000 UTC m=+146.285264004" watchObservedRunningTime="2025-10-03 06:58:32.858673197 +0000 UTC m=+146.285923942" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.859368 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7v4l7" podStartSLOduration=124.859362316 podStartE2EDuration="2m4.859362316s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.81609664 +0000 UTC m=+146.243347375" watchObservedRunningTime="2025-10-03 06:58:32.859362316 +0000 UTC m=+146.286613051" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.868670 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" event={"ID":"9c54b1b0-bdda-4d3f-87ed-6a4502756708","Type":"ContainerStarted","Data":"18b8a694ddd841d3d33e388b0aba350be8acaa98ec42820e3aac4f264f76f410"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.868732 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" event={"ID":"9c54b1b0-bdda-4d3f-87ed-6a4502756708","Type":"ContainerStarted","Data":"728b8db2ee29f407038fdfdb428d789ce55d6c815a9bfaf98d2700314924a2ea"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.870092 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.885235 4810 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9jhmd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.885301 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" podUID="9c54b1b0-bdda-4d3f-87ed-6a4502756708" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.889153 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f5ggg" podStartSLOduration=126.889135843 podStartE2EDuration="2m6.889135843s" podCreationTimestamp="2025-10-03 06:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.885513693 +0000 UTC m=+146.312764428" watchObservedRunningTime="2025-10-03 06:58:32.889135843 +0000 UTC m=+146.316386578" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.919102 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:32 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:32 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:32 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.919178 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.921726 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:32 crc kubenswrapper[4810]: E1003 06:58:32.923629 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.423612547 +0000 UTC m=+146.850863282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.926007 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" event={"ID":"3a82ff2b-e6a7-494f-a4f2-d98949f88eb8","Type":"ContainerStarted","Data":"66a9e00830c939175a7ff7168d2ead0ad93357713803817b0e2504624d3710b2"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.968741 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b8wfx" podStartSLOduration=6.968718823 podStartE2EDuration="6.968718823s" podCreationTimestamp="2025-10-03 06:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:32.968325542 +0000 UTC m=+146.395576277" watchObservedRunningTime="2025-10-03 06:58:32.968718823 +0000 UTC m=+146.395969558" Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.978245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" event={"ID":"b14e69b2-09a9-4af8-b903-6b3aeb219cd8","Type":"ContainerStarted","Data":"4ffed818949dc564dc9bb968cefdd7ea5008e8c2a2e603dddb884f555a775f8c"} Oct 03 06:58:32 crc kubenswrapper[4810]: I1003 06:58:32.978706 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.029921 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.030379 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.530347002 +0000 UTC m=+146.957597737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.030484 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" podStartSLOduration=125.030468566 podStartE2EDuration="2m5.030468566s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.030114186 +0000 UTC m=+146.457364931" watchObservedRunningTime="2025-10-03 06:58:33.030468566 +0000 UTC m=+146.457719301" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.030666 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.032259 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.532243385 +0000 UTC m=+146.959494110 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.046154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" event={"ID":"727e35fc-2fd0-4346-bff3-a086ef10c0c7","Type":"ContainerStarted","Data":"2f31cf52b2e043dca20ad0a33cfdb1dc2be97ace2dd0328ffee39ddcb7f009d1"} Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.046210 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" event={"ID":"727e35fc-2fd0-4346-bff3-a086ef10c0c7","Type":"ContainerStarted","Data":"f8a24905bb50599676e7c4a9c3f20eb6ac9bbe0fd11045835537f324a35d6014"} Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.055043 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9vcdh" podStartSLOduration=125.055017759 podStartE2EDuration="2m5.055017759s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.052684714 +0000 UTC m=+146.479935449" watchObservedRunningTime="2025-10-03 06:58:33.055017759 +0000 UTC m=+146.482268504" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.076041 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" event={"ID":"46c30646-f384-458a-93b3-522895639fad","Type":"ContainerStarted","Data":"f9e60b6c6055a7338a5f0bb3981b1820c262df1348e6b7899e7fef8f6e418aea"} Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.076116 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" event={"ID":"46c30646-f384-458a-93b3-522895639fad","Type":"ContainerStarted","Data":"0e048bd913d83ac480a2133bac76b0fb2b4da4cdec372614196e3489a60e7374"} Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.115572 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" podStartSLOduration=125.115555087 podStartE2EDuration="2m5.115555087s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.112020931 +0000 UTC m=+146.539271666" watchObservedRunningTime="2025-10-03 06:58:33.115555087 +0000 UTC m=+146.542805822" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.120375 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" event={"ID":"ed125176-2266-4792-a978-f73f044b2d83","Type":"ContainerStarted","Data":"906bad1e50226c04513dac847b26ee056c2a739241d2071508b69b2bf55e2086"} Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.121027 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-ng6vq container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.121063 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ng6vq" podUID="1c1c2c63-5588-4027-90a1-b78cdbe3b10b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.132357 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.132979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.133938 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.633919231 +0000 UTC m=+147.061169966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.178546 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vqfp6" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.189498 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" podStartSLOduration=125.189474014 podStartE2EDuration="2m5.189474014s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.157456106 +0000 UTC m=+146.584706841" watchObservedRunningTime="2025-10-03 06:58:33.189474014 +0000 UTC m=+146.616724749" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.191947 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" podStartSLOduration=125.191927261 podStartE2EDuration="2m5.191927261s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.188678712 +0000 UTC m=+146.615929447" watchObservedRunningTime="2025-10-03 06:58:33.191927261 +0000 UTC m=+146.619177996" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.228059 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-hk6wz" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.228542 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-898hd" podStartSLOduration=125.228532834 podStartE2EDuration="2m5.228532834s" podCreationTimestamp="2025-10-03 06:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.227710721 +0000 UTC m=+146.654961456" watchObservedRunningTime="2025-10-03 06:58:33.228532834 +0000 UTC m=+146.655783569" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.237201 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.257047 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.753691004 +0000 UTC m=+147.180941739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.257822 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.277657 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.293885 4810 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-npnrn container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.293985 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" podUID="3a82ff2b-e6a7-494f-a4f2-d98949f88eb8" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.295263 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" podStartSLOduration=126.295235383 podStartE2EDuration="2m6.295235383s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:33.284233361 +0000 UTC m=+146.711484116" 
watchObservedRunningTime="2025-10-03 06:58:33.295235383 +0000 UTC m=+146.722486108" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.341802 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.342665 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.842640132 +0000 UTC m=+147.269890867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.445687 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.446139 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:33.946124238 +0000 UTC m=+147.373374963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.547741 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.548256 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.048234926 +0000 UTC m=+147.475485661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.605618 4810 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xwhmw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.605710 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.650015 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.650597 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.150582181 +0000 UTC m=+147.577832916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.751986 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.752325 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.25230511 +0000 UTC m=+147.679555845 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.854179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.854616 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.354599143 +0000 UTC m=+147.781849878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.909264 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:33 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:33 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:33 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.909398 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.963719 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.963927 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.463877568 +0000 UTC m=+147.891128303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:33 crc kubenswrapper[4810]: I1003 06:58:33.964253 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:33 crc kubenswrapper[4810]: E1003 06:58:33.964634 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.464624579 +0000 UTC m=+147.891875314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.066020 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.066190 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.566153501 +0000 UTC m=+147.993404236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.066631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.067032 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.567015255 +0000 UTC m=+147.994265990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.136463 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dw9sp" event={"ID":"727e35fc-2fd0-4346-bff3-a086ef10c0c7","Type":"ContainerStarted","Data":"345f7b1f555aadb58ab7bc0a601594c574b3712d1b981339d057ecad58411d80"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.147840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" event={"ID":"ed125176-2266-4792-a978-f73f044b2d83","Type":"ContainerStarted","Data":"6ba6e5b3810fe87ee4edbe94a2dbb9157c0bc481c69dad9c2f49277451739597"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.147912 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c6zfm" event={"ID":"ed125176-2266-4792-a978-f73f044b2d83","Type":"ContainerStarted","Data":"76b0855d55a2f6adf8a238b073de9e7b05fdf7fe0b56b59f04deeb393556a2f0"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.150716 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" event={"ID":"5c68d14b-8d32-49b2-b007-a65ede2c357c","Type":"ContainerStarted","Data":"831cef1c92430f7916ccaece46217c5c81400ecf7a6fc6a0d3f01bd3506fbcd4"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.152189 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f9ncj" event={"ID":"5891e675-9ac7-4773-b965-63556edc89b3","Type":"ContainerStarted","Data":"639edb224ecae4caa8d4cc5c08829f37908010e5b4bcc9f344ade72cc3daad34"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.152784 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:34 crc 
kubenswrapper[4810]: I1003 06:58:34.157936 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" event={"ID":"daa05454-65ea-4796-a2eb-79d127178570","Type":"ContainerStarted","Data":"e395ef7a971d00cc109e315588859a6b660db409f51cfeaec6b5d6274ebaf977"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.157960 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" event={"ID":"daa05454-65ea-4796-a2eb-79d127178570","Type":"ContainerStarted","Data":"0ae6fa3729c84c938043565bd47ae5b0278b3929c3c13c8b2f2533c7a7281b5c"} Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.158408 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-ng6vq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.158448 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ng6vq" podUID="1c1c2c63-5588-4027-90a1-b78cdbe3b10b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.161610 4810 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vk2ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.161696 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.168189 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.168616 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.668598309 +0000 UTC m=+148.095849044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.172051 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.172129 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9jhmd" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.172967 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.215330 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-6gzbt" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.247210 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f9ncj" podStartSLOduration=9.247185514 podStartE2EDuration="9.247185514s" podCreationTimestamp="2025-10-03 06:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:34.246462323 +0000 UTC m=+147.673713048" watchObservedRunningTime="2025-10-03 06:58:34.247185514 +0000 UTC m=+147.674436249" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.270179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.272018 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.771998993 +0000 UTC m=+148.199249718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.375975 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.376611 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:34.876567159 +0000 UTC m=+148.303817894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.512395 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.513114 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.01307682 +0000 UTC m=+148.440327555 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.620600 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.621212 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.121191164 +0000 UTC m=+148.548441899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.645256 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" podStartSLOduration=127.645236633 podStartE2EDuration="2m7.645236633s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:34.592026255 +0000 UTC m=+148.019276990" watchObservedRunningTime="2025-10-03 06:58:34.645236633 +0000 UTC m=+148.072487368" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.722463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.722952 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.222929882 +0000 UTC m=+148.650180617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.823647 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.823921 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.323838588 +0000 UTC m=+148.751089323 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.824393 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.824822 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.324805634 +0000 UTC m=+148.752056359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.866849 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.868031 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.873849 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.895939 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.906502 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:34 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:34 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:34 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.906606 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.925623 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.925786 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.425759041 +0000 UTC m=+148.853009776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.925939 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.926032 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.926099 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:34 crc kubenswrapper[4810]: I1003 06:58:34.926331 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-722nd\" (UniqueName: \"kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:34 crc kubenswrapper[4810]: E1003 06:58:34.926532 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.426508492 +0000 UTC m=+148.853759217 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.027472 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.027753 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.027815 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.027918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-722nd\" (UniqueName: \"kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: E1003 06:58:35.028382 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.528363803 +0000 UTC m=+148.955614538 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.028579 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.028657 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.044576 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.045558 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.047879 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.067777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-722nd\" (UniqueName: \"kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd\") pod \"certified-operators-69cqj\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.069733 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.130420 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.130493 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.130571 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grxnb\" (UniqueName: \"kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " 
pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.130603 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: E1003 06:58:35.131080 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.631054478 +0000 UTC m=+149.058305203 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.166531 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" event={"ID":"5c68d14b-8d32-49b2-b007-a65ede2c357c","Type":"ContainerStarted","Data":"5e7dc430863421212935ea4292ced772f3b551c7216391038c55a8791b883748"} Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.167087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" event={"ID":"5c68d14b-8d32-49b2-b007-a65ede2c357c","Type":"ContainerStarted","Data":"e9cbfc810e702f4bc9fc82da9d4f5dc7142f2b622bf474152e1cfb27610958fa"} Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.169560 4810 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vk2ql container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.169627 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.189520 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.231992 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.232233 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.232449 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grxnb\" (UniqueName: \"kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.232719 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: E1003 06:58:35.233425 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.733391073 +0000 UTC m=+149.160641808 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.235428 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.236638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.247448 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.248832 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.261415 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grxnb\" (UniqueName: \"kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb\") pod \"community-operators-xp4qz\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.278776 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336298 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336653 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336691 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336724 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n92k\" (UniqueName: \"kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336759 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336791 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.336848 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.337854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:35 crc kubenswrapper[4810]: E1003 06:58:35.342758 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.84273372 +0000 UTC m=+149.269984455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-59nxd" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.349383 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.358990 4810 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.364799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.365156 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.365929 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.409787 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.438008 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.438543 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.438642 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.438673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n92k\" (UniqueName: \"kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: E1003 06:58:35.439339 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 06:58:35.939306557 +0000 UTC m=+149.366557292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.439796 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.440059 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.485595 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.487309 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.489275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n92k\" (UniqueName: \"kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k\") pod \"certified-operators-xcqst\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.492823 4810 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T06:58:35.359017106Z","Handler":null,"Name":""} Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.493217 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.497938 4810 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.497977 4810 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.545990 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.546124 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8gr42\" (UniqueName: \"kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.546157 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.546190 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.587341 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.587458 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.587539 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.641268 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.659266 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.660256 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gr42\" (UniqueName: \"kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.660293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.660316 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.661115 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.661589 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.698465 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gr42\" (UniqueName: \"kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42\") pod \"community-operators-8fwwl\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.761140 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-59nxd\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.812871 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.849284 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.871739 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.875396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.912178 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:35 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:35 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:35 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.912249 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:35 crc kubenswrapper[4810]: I1003 06:58:35.958099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.039336 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.309141 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" event={"ID":"5c68d14b-8d32-49b2-b007-a65ede2c357c","Type":"ContainerStarted","Data":"783bd49ba30bbe4796fb662ae84903517f4be1d4f587e06c7cc0b019be2e909a"} Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.338820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8c8d685ef67bac19dc227c0166ff43bd4da9862248384dddc839168f9cf4d77f"} Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.342359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerStarted","Data":"2b6618e410fb239402351282bc3ca407d01a6ad2bab568e7442a4774f716b7f9"} Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.352489 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerStarted","Data":"041f021fa659503d9c6536b55dce987d5ac0be73453f89360396dab0a55db3e6"} Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.410923 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-q28kf" podStartSLOduration=10.410889926 podStartE2EDuration="10.410889926s" 
podCreationTimestamp="2025-10-03 06:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:36.365870402 +0000 UTC m=+149.793121147" watchObservedRunningTime="2025-10-03 06:58:36.410889926 +0000 UTC m=+149.838140661" Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.623997 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.626447 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:58:36 crc kubenswrapper[4810]: W1003 06:58:36.676974 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5635f031_25c4_438c_903a_9c537b59b55c.slice/crio-c66d012182a07126ac1fa3a5722799f84f176a59e19fd5ce694129b4b7278a6b WatchSource:0}: Error finding container c66d012182a07126ac1fa3a5722799f84f176a59e19fd5ce694129b4b7278a6b: Status 404 returned error can't find the container with id c66d012182a07126ac1fa3a5722799f84f176a59e19fd5ce694129b4b7278a6b Oct 03 06:58:36 crc kubenswrapper[4810]: W1003 06:58:36.678127 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf63c741f_b306_4022_a6ec_f6dcf7d1e504.slice/crio-700981d698e5e8d116c7a59f22fa98619b614c6e3841f6a62587e2b2370c60de WatchSource:0}: Error finding container 700981d698e5e8d116c7a59f22fa98619b614c6e3841f6a62587e2b2370c60de: Status 404 returned error can't find the container with id 700981d698e5e8d116c7a59f22fa98619b614c6e3841f6a62587e2b2370c60de Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.735517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 06:58:36 crc kubenswrapper[4810]: W1003 06:58:36.755306 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53718cb5_d23a_45d2_9b8a_61fd2e8903e5.slice/crio-4f34f4ec36125c6523f90379a83a7ebe6a878479500b9b9586852048a528785b WatchSource:0}: Error finding container 4f34f4ec36125c6523f90379a83a7ebe6a878479500b9b9586852048a528785b: Status 404 returned error can't find the container with id 4f34f4ec36125c6523f90379a83a7ebe6a878479500b9b9586852048a528785b Oct 03 06:58:36 crc kubenswrapper[4810]: W1003 06:58:36.756627 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-82914505ed1ed055b154e697755a9a9b54578cee9e3a9918236311acd93a670e WatchSource:0}: Error finding container 82914505ed1ed055b154e697755a9a9b54578cee9e3a9918236311acd93a670e: Status 404 returned error can't find the container with id 82914505ed1ed055b154e697755a9a9b54578cee9e3a9918236311acd93a670e Oct 03 06:58:36 crc kubenswrapper[4810]: I1003 06:58:36.906980 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:36 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:36 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:36 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:36 crc 
kubenswrapper[4810]: I1003 06:58:36.907128 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.040323 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.041334 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.043392 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.062456 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.109947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.110059 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.110083 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krl79\" (UniqueName: \"kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.211371 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.211433 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krl79\" (UniqueName: \"kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.211511 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.212110 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.212401 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.233799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krl79\" (UniqueName: \"kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79\") pod \"redhat-marketplace-xjng2\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.315235 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.356298 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.368597 4810 generic.go:334] "Generic (PLEG): container finished" podID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerID="b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229" exitCode=0 Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.368693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerDied","Data":"b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.369257 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerStarted","Data":"700981d698e5e8d116c7a59f22fa98619b614c6e3841f6a62587e2b2370c60de"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.375004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"19af14ecb2bc49204b910fb09a36ddbfe97b49c8d0a9cc51e14e43662c41f415"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.375063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"82914505ed1ed055b154e697755a9a9b54578cee9e3a9918236311acd93a670e"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.375792 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.380163 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.381952 4810 generic.go:334] "Generic (PLEG): container finished" 
podID="92710d73-0afa-4f41-b424-47adb9d5431d" containerID="b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9" exitCode=0 Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.382152 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerDied","Data":"b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.384997 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"227899bec658339409b82752b5df8383501480225bee10639671abeb978c8725"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.390150 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" event={"ID":"53718cb5-d23a-45d2-9b8a-61fd2e8903e5","Type":"ContainerStarted","Data":"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.390212 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" event={"ID":"53718cb5-d23a-45d2-9b8a-61fd2e8903e5","Type":"ContainerStarted","Data":"4f34f4ec36125c6523f90379a83a7ebe6a878479500b9b9586852048a528785b"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.390485 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.401720 4810 generic.go:334] "Generic (PLEG): container finished" podID="8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" containerID="7aab4ae837560cb6087d7b83ac07a2cee7a88c4fe20105d09db697be89895c6e" exitCode=0 Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.401816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" event={"ID":"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a","Type":"ContainerDied","Data":"7aab4ae837560cb6087d7b83ac07a2cee7a88c4fe20105d09db697be89895c6e"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.415172 4810 generic.go:334] "Generic (PLEG): container finished" podID="c93baf43-7628-4225-8fad-d2543b491649" containerID="905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2" exitCode=0 Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.415239 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerDied","Data":"905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.433027 4810 generic.go:334] "Generic (PLEG): container finished" podID="5635f031-25c4-438c-903a-9c537b59b55c" containerID="bd1c811185e3cb4482003129b47a583089a62d880e64a28f2619e1121a5c739a" exitCode=0 Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.433815 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerDied","Data":"bd1c811185e3cb4482003129b47a583089a62d880e64a28f2619e1121a5c739a"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.433857 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerStarted","Data":"c66d012182a07126ac1fa3a5722799f84f176a59e19fd5ce694129b4b7278a6b"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.445106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c15421d450926c611e195d2c3c056f46fc7766c87253dda03fe706e969d2a6bd"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.445171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"69e644ce43a7a39c70f2f3a6b755f33b6472e6f1c5b72824dfd68fe0884eae83"} Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.452856 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.505339 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.505679 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.618393 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.619024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.619240 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmdk\" (UniqueName: \"kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.630038 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.684718 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" podStartSLOduration=130.684624796 podStartE2EDuration="2m10.684624796s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:37.680368859 +0000 UTC m=+151.107619594" watchObservedRunningTime="2025-10-03 06:58:37.684624796 +0000 UTC m=+151.111875531" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.721489 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.721580 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.721623 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmdk\" (UniqueName: \"kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.722544 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.723849 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.741994 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmdk\" (UniqueName: \"kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk\") pod \"redhat-marketplace-g5275\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.824967 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.908041 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:37 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:37 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:37 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:37 crc kubenswrapper[4810]: I1003 06:58:37.908115 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.039961 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.041663 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.044008 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.051304 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.113089 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.113152 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.123202 4810 patch_prober.go:28] interesting pod/console-f9d7485db-mp8wp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.123291 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mp8wp" podUID="5468dbc2-8210-4771-b950-cd96757c5788" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.235912 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlgp\" (UniqueName: \"kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.236037 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.236067 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.271046 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.277173 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-npnrn" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.321926 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.338677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlgp\" (UniqueName: \"kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp\") 
pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.338771 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.338796 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.339321 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.339448 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.378984 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlgp\" (UniqueName: \"kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp\") pod \"redhat-operators-9jdlp\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.414964 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.457382 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-ng6vq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.457440 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ng6vq" podUID="1c1c2c63-5588-4027-90a1-b78cdbe3b10b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.458325 4810 patch_prober.go:28] interesting pod/downloads-7954f5f757-ng6vq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" start-of-body= Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.458393 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ng6vq" podUID="1c1c2c63-5588-4027-90a1-b78cdbe3b10b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.16:8080/\": dial tcp 10.217.0.16:8080: connect: connection refused" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.472325 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.474238 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.525270 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.527039 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.535828 4810 generic.go:334] "Generic (PLEG): container finished" podID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerID="00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4" exitCode=0 Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.535923 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.536286 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.536575 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.536637 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerDied","Data":"00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4"} Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.536684 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerStarted","Data":"0a71d4d7a4356e6a673493d52c69e906e49908e70226b2e5da82c53095c87c80"} Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.541487 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerStarted","Data":"e590f9c7d1e9eecf1914f2455a77fdeaad86a8bec85db8dc982a2ba7818be82f"} Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.561256 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.561378 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.575768 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.603061 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.657591 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t6kp\" (UniqueName: \"kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.657634 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.657691 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.657843 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.657923 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.759122 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t6kp\" (UniqueName: \"kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.759609 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.759662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.759751 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.759841 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.760562 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.763079 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.763369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.780738 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.781442 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t6kp\" (UniqueName: \"kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp\") pod \"redhat-operators-9qvqd\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.901808 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.903482 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.906422 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:38 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:38 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:38 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.906513 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.919820 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.927360 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:38 crc kubenswrapper[4810]: I1003 06:58:38.968245 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.063415 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.072364 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwmhm\" (UniqueName: \"kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm\") pod \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.072585 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume\") pod \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.072647 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") pod \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\" (UID: \"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a\") " Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.074528 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume" (OuterVolumeSpecName: "config-volume") pod "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" (UID: "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.076833 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm" (OuterVolumeSpecName: "kube-api-access-zwmhm") pod "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" (UID: "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a"). InnerVolumeSpecName "kube-api-access-zwmhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.079497 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" (UID: "8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.083953 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.178786 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.179133 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.179144 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwmhm\" (UniqueName: \"kubernetes.io/projected/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a-kube-api-access-zwmhm\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.356161 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.359134 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 06:58:39 crc kubenswrapper[4810]: E1003 06:58:39.359361 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" containerName="collect-profiles" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.359374 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" containerName="collect-profiles" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.359506 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" containerName="collect-profiles" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.359864 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.362401 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.365376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.369852 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.411230 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.482887 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.482962 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.557163 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.557167 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz" event={"ID":"8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a","Type":"ContainerDied","Data":"dc8bd39318b647148d2e2fb32ee4518c286d6239d8f1b1955296538a0b908b44"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.557686 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8bd39318b647148d2e2fb32ee4518c286d6239d8f1b1955296538a0b908b44" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.561083 4810 generic.go:334] "Generic (PLEG): container finished" podID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerID="2233f712b12697ce6ec360ba42c356f64cef3726da86b75dd4108b4247e373e2" exitCode=0 Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.561157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerDied","Data":"2233f712b12697ce6ec360ba42c356f64cef3726da86b75dd4108b4247e373e2"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.561220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerStarted","Data":"da798c9bf0951c208e5d311c1a0db891ed775772a9b72b2df854ab7321325482"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.564246 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" 
event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerStarted","Data":"e395f064a0a5a0f160e614e8b5b7a74cf153ba638a7dad615abcbcda3189a7d9"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.585591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.585679 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.586380 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.587758 4810 generic.go:334] "Generic (PLEG): container finished" podID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerID="d3f69260a2252473ea32efac0c5470a2163d7b5ec04d247720d4ada5c3b7b94a" exitCode=0 Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.587854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerDied","Data":"d3f69260a2252473ea32efac0c5470a2163d7b5ec04d247720d4ada5c3b7b94a"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.592080 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"320c6ac3-85e9-48d7-9a37-9dc36d070665","Type":"ContainerStarted","Data":"d09f8950ffcac47ea8b579f6eae4b81d1ec9f565634ac6c0ecab48e57b2a98e9"} Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.596769 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zthm4" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.614742 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.752674 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.913275 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:39 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:39 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:39 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:39 crc kubenswrapper[4810]: I1003 06:58:39.913334 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.425189 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 06:58:40 crc kubenswrapper[4810]: W1003 06:58:40.490499 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6c1bb594_4d88_457c_b763_a13d018fb939.slice/crio-02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002 WatchSource:0}: Error finding container 02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002: Status 404 returned error can't find the container with id 02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002 Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.619602 4810 generic.go:334] "Generic (PLEG): container finished" podID="4b733985-b1bd-4163-998b-6998a05076e3" containerID="ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303" exitCode=0 Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.620119 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerDied","Data":"ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303"} Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.628111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6c1bb594-4d88-457c-b763-a13d018fb939","Type":"ContainerStarted","Data":"02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002"} Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.660959 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"320c6ac3-85e9-48d7-9a37-9dc36d070665","Type":"ContainerStarted","Data":"9b0a5d2a6aee0b2382a0fe0b87f1d2cb6f6456550fc241b3444b53a4ababcc49"} Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.693670 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.693648687 podStartE2EDuration="2.693648687s" podCreationTimestamp="2025-10-03 06:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:40.693011208 +0000 UTC m=+154.120261943" watchObservedRunningTime="2025-10-03 06:58:40.693648687 +0000 UTC m=+154.120899422" Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.905603 4810 patch_prober.go:28] interesting pod/router-default-5444994796-xt6fn container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 06:58:40 crc kubenswrapper[4810]: [-]has-synced failed: reason withheld Oct 03 06:58:40 crc kubenswrapper[4810]: [+]process-running ok Oct 03 06:58:40 crc kubenswrapper[4810]: healthz check failed Oct 03 06:58:40 crc kubenswrapper[4810]: I1003 06:58:40.905674 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xt6fn" podUID="0710cdc3-8aa1-4d3b-8ab2-1ff402b20941" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.670648 4810 generic.go:334] "Generic (PLEG): container finished" podID="320c6ac3-85e9-48d7-9a37-9dc36d070665" containerID="9b0a5d2a6aee0b2382a0fe0b87f1d2cb6f6456550fc241b3444b53a4ababcc49" exitCode=0 Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.671056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"320c6ac3-85e9-48d7-9a37-9dc36d070665","Type":"ContainerDied","Data":"9b0a5d2a6aee0b2382a0fe0b87f1d2cb6f6456550fc241b3444b53a4ababcc49"} Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.675073 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6c1bb594-4d88-457c-b763-a13d018fb939","Type":"ContainerStarted","Data":"5aa72d7a2a28bf20aafe434e5207439d6d48ace668c0243b144541f98e225b55"} Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.699478 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.699453213 podStartE2EDuration="2.699453213s" podCreationTimestamp="2025-10-03 06:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:58:41.697278743 +0000 UTC m=+155.124529498" watchObservedRunningTime="2025-10-03 06:58:41.699453213 +0000 UTC m=+155.126703948" Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.906226 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:41 crc kubenswrapper[4810]: I1003 06:58:41.911130 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xt6fn" Oct 03 06:58:42 crc kubenswrapper[4810]: I1003 06:58:42.686700 4810 generic.go:334] "Generic (PLEG): container finished" podID="6c1bb594-4d88-457c-b763-a13d018fb939" containerID="5aa72d7a2a28bf20aafe434e5207439d6d48ace668c0243b144541f98e225b55" exitCode=0 Oct 03 06:58:42 crc kubenswrapper[4810]: I1003 06:58:42.686767 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6c1bb594-4d88-457c-b763-a13d018fb939","Type":"ContainerDied","Data":"5aa72d7a2a28bf20aafe434e5207439d6d48ace668c0243b144541f98e225b55"} Oct 03 06:58:42 crc kubenswrapper[4810]: I1003 06:58:42.991354 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.149469 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access\") pod \"320c6ac3-85e9-48d7-9a37-9dc36d070665\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.149593 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir\") pod \"320c6ac3-85e9-48d7-9a37-9dc36d070665\" (UID: \"320c6ac3-85e9-48d7-9a37-9dc36d070665\") " Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.150053 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "320c6ac3-85e9-48d7-9a37-9dc36d070665" (UID: "320c6ac3-85e9-48d7-9a37-9dc36d070665"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.158248 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "320c6ac3-85e9-48d7-9a37-9dc36d070665" (UID: "320c6ac3-85e9-48d7-9a37-9dc36d070665"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.251356 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/320c6ac3-85e9-48d7-9a37-9dc36d070665-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.251393 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/320c6ac3-85e9-48d7-9a37-9dc36d070665-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.681823 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f9ncj" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.695468 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.696036 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"320c6ac3-85e9-48d7-9a37-9dc36d070665","Type":"ContainerDied","Data":"d09f8950ffcac47ea8b579f6eae4b81d1ec9f565634ac6c0ecab48e57b2a98e9"} Oct 03 06:58:43 crc kubenswrapper[4810]: I1003 06:58:43.696065 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09f8950ffcac47ea8b579f6eae4b81d1ec9f565634ac6c0ecab48e57b2a98e9" Oct 03 06:58:48 crc kubenswrapper[4810]: I1003 06:58:48.142232 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:48 crc kubenswrapper[4810]: I1003 06:58:48.148438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 06:58:48 crc kubenswrapper[4810]: I1003 06:58:48.428124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ng6vq" Oct 03 06:58:49 crc kubenswrapper[4810]: I1003 06:58:49.752153 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:49 crc kubenswrapper[4810]: I1003 06:58:49.763744 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/08c7f4c0-52b1-4047-9da9-c6a76b0e06e7-metrics-certs\") pod \"network-metrics-daemon-drrxm\" (UID: \"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7\") " pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:50 crc kubenswrapper[4810]: I1003 06:58:50.027744 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-drrxm" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.770470 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.778460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6c1bb594-4d88-457c-b763-a13d018fb939","Type":"ContainerDied","Data":"02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002"} Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.778512 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d1939a87fc97cbce1b1d7d7fbbf71058875eae68f3a16885eeeeba5acfa002" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.778838 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.822815 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.851764 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir\") pod \"6c1bb594-4d88-457c-b763-a13d018fb939\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.852059 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access\") pod \"6c1bb594-4d88-457c-b763-a13d018fb939\" (UID: \"6c1bb594-4d88-457c-b763-a13d018fb939\") " Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.852762 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6c1bb594-4d88-457c-b763-a13d018fb939" (UID: "6c1bb594-4d88-457c-b763-a13d018fb939"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.864015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6c1bb594-4d88-457c-b763-a13d018fb939" (UID: "6c1bb594-4d88-457c-b763-a13d018fb939"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.954324 4810 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c1bb594-4d88-457c-b763-a13d018fb939-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 06:58:55 crc kubenswrapper[4810]: I1003 06:58:55.954371 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c1bb594-4d88-457c-b763-a13d018fb939-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:02 crc kubenswrapper[4810]: I1003 06:59:02.090098 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 06:59:02 crc kubenswrapper[4810]: I1003 06:59:02.091357 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 06:59:02 crc kubenswrapper[4810]: E1003 06:59:02.120872 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 06:59:02 crc kubenswrapper[4810]: E1003 06:59:02.121128 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8gr42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8fwwl_openshift-marketplace(5635f031-25c4-438c-903a-9c537b59b55c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" 
Oct 03 06:59:02 crc kubenswrapper[4810]: E1003 06:59:02.122998 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8fwwl" podUID="5635f031-25c4-438c-903a-9c537b59b55c" Oct 03 06:59:03 crc kubenswrapper[4810]: E1003 06:59:03.436280 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8fwwl" podUID="5635f031-25c4-438c-903a-9c537b59b55c" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.927042 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.927855 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wmlgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9jdlp_openshift-marketplace(7cc3ed29-4f96-43fd-8efa-b4e07a898fdf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.932021 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9jdlp" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.949034 4810 log.go:32] "PullImage from image service failed" err="rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.949224 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-krl79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xjng2_openshift-marketplace(3abedf38-c2d0-4d29-b0ae-216ecefd4a09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.950353 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xjng2" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.980129 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.980355 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4t6kp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9qvqd_openshift-marketplace(4b733985-b1bd-4163-998b-6998a05076e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 06:59:07 crc kubenswrapper[4810]: E1003 06:59:07.982350 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9qvqd" podUID="4b733985-b1bd-4163-998b-6998a05076e3" Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.045880 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-drrxm"] Oct 03 06:59:08 crc kubenswrapper[4810]: W1003 06:59:08.050202 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c7f4c0_52b1_4047_9da9_c6a76b0e06e7.slice/crio-7285381e8976a6481c6a59d041d214154a33950e4f92505580bc8c3bf7a50f86 WatchSource:0}: Error finding container 7285381e8976a6481c6a59d041d214154a33950e4f92505580bc8c3bf7a50f86: Status 404 returned error can't find the container with id 7285381e8976a6481c6a59d041d214154a33950e4f92505580bc8c3bf7a50f86 Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.868545 4810 generic.go:334] "Generic (PLEG): container finished" podID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerID="99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe" exitCode=0 Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.869241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerDied","Data":"99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.875339 4810 generic.go:334] "Generic (PLEG): container finished" podID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerID="ce832511c506869679569d0160242d183100c144c41c797d8c5a1ec273e83795" exitCode=0 Oct 03 06:59:08 crc kubenswrapper[4810]: 
I1003 06:59:08.875423 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerDied","Data":"ce832511c506869679569d0160242d183100c144c41c797d8c5a1ec273e83795"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.879985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drrxm" event={"ID":"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7","Type":"ContainerStarted","Data":"0eb54609b0502f95ace7043949810733328ed2d9d700de91d7500b5d1a07ae4c"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.880049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drrxm" event={"ID":"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7","Type":"ContainerStarted","Data":"e383522b56a11b5efe72b25a9e0b85797e43712c3219bb183a38829d6ab57d17"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.880067 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-drrxm" event={"ID":"08c7f4c0-52b1-4047-9da9-c6a76b0e06e7","Type":"ContainerStarted","Data":"7285381e8976a6481c6a59d041d214154a33950e4f92505580bc8c3bf7a50f86"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.884959 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerDied","Data":"7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e"} Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.883276 4810 generic.go:334] "Generic (PLEG): container finished" podID="92710d73-0afa-4f41-b424-47adb9d5431d" containerID="7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e" exitCode=0 Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.897496 4810 generic.go:334] "Generic (PLEG): container finished" podID="c93baf43-7628-4225-8fad-d2543b491649" containerID="53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb" exitCode=0 Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.899774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerDied","Data":"53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb"} Oct 03 06:59:08 crc kubenswrapper[4810]: E1003 06:59:08.902341 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9jdlp" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" Oct 03 06:59:08 crc kubenswrapper[4810]: E1003 06:59:08.902555 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xjng2" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" Oct 03 06:59:08 crc kubenswrapper[4810]: E1003 06:59:08.906305 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9qvqd" 
podUID="4b733985-b1bd-4163-998b-6998a05076e3" Oct 03 06:59:08 crc kubenswrapper[4810]: I1003 06:59:08.925864 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-drrxm" podStartSLOduration=161.925838596 podStartE2EDuration="2m41.925838596s" podCreationTimestamp="2025-10-03 06:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:59:08.924874 +0000 UTC m=+182.352124755" watchObservedRunningTime="2025-10-03 06:59:08.925838596 +0000 UTC m=+182.353089331" Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.114713 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-9hhnx" Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.907067 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerStarted","Data":"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49"} Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.910336 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerStarted","Data":"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd"} Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.914359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerStarted","Data":"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5"} Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.917575 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerStarted","Data":"f0358f9dff898c888eb090f57607c6cca910939320ab12d06d187441c4eff9fe"} Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.945361 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xp4qz" podStartSLOduration=2.844992131 podStartE2EDuration="34.945326768s" podCreationTimestamp="2025-10-03 06:58:35 +0000 UTC" firstStartedPulling="2025-10-03 06:58:37.386401402 +0000 UTC m=+150.813652177" lastFinishedPulling="2025-10-03 06:59:09.486736079 +0000 UTC m=+182.913986814" observedRunningTime="2025-10-03 06:59:09.944278459 +0000 UTC m=+183.371529194" watchObservedRunningTime="2025-10-03 06:59:09.945326768 +0000 UTC m=+183.372577553" Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.966992 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-69cqj" podStartSLOduration=3.9066391510000003 podStartE2EDuration="35.966965971s" podCreationTimestamp="2025-10-03 06:58:34 +0000 UTC" firstStartedPulling="2025-10-03 06:58:37.432076784 +0000 UTC m=+150.859327519" lastFinishedPulling="2025-10-03 06:59:09.492403604 +0000 UTC m=+182.919654339" observedRunningTime="2025-10-03 06:59:09.965696436 +0000 UTC m=+183.392947231" watchObservedRunningTime="2025-10-03 06:59:09.966965971 +0000 UTC m=+183.394216706" Oct 03 06:59:09 crc kubenswrapper[4810]: I1003 06:59:09.993316 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-xcqst" podStartSLOduration=3.11018812 podStartE2EDuration="34.993290942s" podCreationTimestamp="2025-10-03 06:58:35 +0000 UTC" firstStartedPulling="2025-10-03 06:58:37.379822742 +0000 UTC m=+150.807073477" lastFinishedPulling="2025-10-03 06:59:09.262925564 +0000 UTC m=+182.690176299" observedRunningTime="2025-10-03 06:59:09.991059651 +0000 UTC m=+183.418310386" watchObservedRunningTime="2025-10-03 06:59:09.993290942 +0000 UTC m=+183.420541697" Oct 03 06:59:10 crc kubenswrapper[4810]: I1003 06:59:10.022464 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g5275" podStartSLOduration=3.278595981 podStartE2EDuration="33.022447722s" podCreationTimestamp="2025-10-03 06:58:37 +0000 UTC" firstStartedPulling="2025-10-03 06:58:39.591100588 +0000 UTC m=+153.018351323" lastFinishedPulling="2025-10-03 06:59:09.334952339 +0000 UTC m=+182.762203064" observedRunningTime="2025-10-03 06:59:10.020408015 +0000 UTC m=+183.447658750" watchObservedRunningTime="2025-10-03 06:59:10.022447722 +0000 UTC m=+183.449698447" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.190260 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.191274 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.410598 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.410691 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.588488 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.588665 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.676370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 06:59:15 crc kubenswrapper[4810]: I1003 06:59:15.999324 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:16 crc kubenswrapper[4810]: I1003 06:59:15.999979 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:16 crc kubenswrapper[4810]: I1003 06:59:16.002888 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:16 crc kubenswrapper[4810]: I1003 06:59:16.058667 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:16 crc kubenswrapper[4810]: I1003 06:59:16.060175 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:16 crc kubenswrapper[4810]: I1003 06:59:16.064514 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:17 crc kubenswrapper[4810]: I1003 06:59:17.825082 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:17 crc kubenswrapper[4810]: I1003 06:59:17.825158 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:17 crc kubenswrapper[4810]: I1003 06:59:17.869389 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:18 crc kubenswrapper[4810]: I1003 06:59:18.028221 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:18 crc kubenswrapper[4810]: I1003 06:59:18.251758 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:59:18 crc kubenswrapper[4810]: I1003 06:59:18.252128 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xcqst" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="registry-server" containerID="cri-o://a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5" gracePeriod=2 Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.818936 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.942550 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content\") pod \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.942784 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n92k\" (UniqueName: \"kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k\") pod \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.942822 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities\") pod \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\" (UID: \"f63c741f-b306-4022-a6ec-f6dcf7d1e504\") " Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.943675 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities" (OuterVolumeSpecName: "utilities") pod "f63c741f-b306-4022-a6ec-f6dcf7d1e504" (UID: "f63c741f-b306-4022-a6ec-f6dcf7d1e504"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.952752 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k" (OuterVolumeSpecName: "kube-api-access-9n92k") pod "f63c741f-b306-4022-a6ec-f6dcf7d1e504" (UID: "f63c741f-b306-4022-a6ec-f6dcf7d1e504"). InnerVolumeSpecName "kube-api-access-9n92k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.982367 4810 generic.go:334] "Generic (PLEG): container finished" podID="5635f031-25c4-438c-903a-9c537b59b55c" containerID="ef188f3f7bec6b024017846bfd1169c05eafda8ee0438e1a5d64d5f5471b4d10" exitCode=0 Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.982447 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerDied","Data":"ef188f3f7bec6b024017846bfd1169c05eafda8ee0438e1a5d64d5f5471b4d10"} Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.986669 4810 generic.go:334] "Generic (PLEG): container finished" podID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerID="a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5" exitCode=0 Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.986695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerDied","Data":"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5"} Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.986711 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xcqst" event={"ID":"f63c741f-b306-4022-a6ec-f6dcf7d1e504","Type":"ContainerDied","Data":"700981d698e5e8d116c7a59f22fa98619b614c6e3841f6a62587e2b2370c60de"} Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.986730 4810 scope.go:117] "RemoveContainer" containerID="a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5" Oct 03 06:59:19 crc kubenswrapper[4810]: I1003 06:59:19.986978 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xcqst" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.021831 4810 scope.go:117] "RemoveContainer" containerID="99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.038561 4810 scope.go:117] "RemoveContainer" containerID="b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.043532 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63c741f-b306-4022-a6ec-f6dcf7d1e504" (UID: "f63c741f-b306-4022-a6ec-f6dcf7d1e504"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.045315 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.045336 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n92k\" (UniqueName: \"kubernetes.io/projected/f63c741f-b306-4022-a6ec-f6dcf7d1e504-kube-api-access-9n92k\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.045350 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63c741f-b306-4022-a6ec-f6dcf7d1e504-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.059727 4810 scope.go:117] "RemoveContainer" containerID="a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5" Oct 03 06:59:20 crc kubenswrapper[4810]: E1003 06:59:20.060254 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5\": container with ID starting with a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5 not found: ID does not exist" containerID="a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.060291 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5"} err="failed to get container status \"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5\": rpc error: code = NotFound desc = could not find container \"a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5\": container with ID starting with a135a7e0ab9e15d9485bf5224ae5fef932e451c030da2695b65ef07fcbdde3e5 not found: ID does not exist" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.060338 4810 scope.go:117] "RemoveContainer" containerID="99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe" Oct 03 06:59:20 crc kubenswrapper[4810]: E1003 06:59:20.060843 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe\": container with ID starting with 99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe not found: ID does not exist" containerID="99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.061064 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe"} err="failed to get container status \"99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe\": rpc error: code = NotFound desc = could not find container \"99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe\": container with ID starting with 99d8fab11df07b9a336077665220c72d213a5918bb58e16f790f3f296a2435fe not found: ID does not exist" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.061114 4810 scope.go:117] "RemoveContainer" containerID="b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229" Oct 03 06:59:20 crc 
kubenswrapper[4810]: E1003 06:59:20.061432 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229\": container with ID starting with b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229 not found: ID does not exist" containerID="b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.061462 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229"} err="failed to get container status \"b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229\": rpc error: code = NotFound desc = could not find container \"b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229\": container with ID starting with b152e9ce41f639595487be65fd7565ddb1be8b431ecdf37a9d0957de72465229 not found: ID does not exist" Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.318707 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.322107 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xcqst"] Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.653185 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:59:20 crc kubenswrapper[4810]: I1003 06:59:20.654072 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g5275" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="registry-server" containerID="cri-o://f0358f9dff898c888eb090f57607c6cca910939320ab12d06d187441c4eff9fe" gracePeriod=2 Oct 03 06:59:21 crc kubenswrapper[4810]: I1003 06:59:21.316819 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" path="/var/lib/kubelet/pods/f63c741f-b306-4022-a6ec-f6dcf7d1e504/volumes" Oct 03 06:59:22 crc kubenswrapper[4810]: I1003 06:59:22.008955 4810 generic.go:334] "Generic (PLEG): container finished" podID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerID="f0358f9dff898c888eb090f57607c6cca910939320ab12d06d187441c4eff9fe" exitCode=0 Oct 03 06:59:22 crc kubenswrapper[4810]: I1003 06:59:22.008976 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerDied","Data":"f0358f9dff898c888eb090f57607c6cca910939320ab12d06d187441c4eff9fe"} Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.241141 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.294120 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities\") pod \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.294164 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdmdk\" (UniqueName: \"kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk\") pod \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.294304 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content\") pod \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\" (UID: \"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa\") " Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.294981 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities" (OuterVolumeSpecName: "utilities") pod "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" (UID: "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.300031 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk" (OuterVolumeSpecName: "kube-api-access-gdmdk") pod "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" (UID: "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa"). InnerVolumeSpecName "kube-api-access-gdmdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.313618 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" (UID: "f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.395962 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.396005 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:23 crc kubenswrapper[4810]: I1003 06:59:23.396017 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdmdk\" (UniqueName: \"kubernetes.io/projected/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa-kube-api-access-gdmdk\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.040318 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g5275" event={"ID":"f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa","Type":"ContainerDied","Data":"e590f9c7d1e9eecf1914f2455a77fdeaad86a8bec85db8dc982a2ba7818be82f"} Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.040724 4810 scope.go:117] "RemoveContainer" containerID="f0358f9dff898c888eb090f57607c6cca910939320ab12d06d187441c4eff9fe" Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.040978 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g5275" Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.065198 4810 scope.go:117] "RemoveContainer" containerID="ce832511c506869679569d0160242d183100c144c41c797d8c5a1ec273e83795" Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.084932 4810 scope.go:117] "RemoveContainer" containerID="d3f69260a2252473ea32efac0c5470a2163d7b5ec04d247720d4ada5c3b7b94a" Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.085174 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:59:24 crc kubenswrapper[4810]: I1003 06:59:24.092222 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g5275"] Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.061934 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerStarted","Data":"f4a0f1b25978d38847f6fcef923d0faa061bb1d0f04390be03f62f960a9a7d13"} Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.063843 4810 generic.go:334] "Generic (PLEG): container finished" podID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerID="55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263" exitCode=0 Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.063917 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerDied","Data":"55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263"} Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.067868 4810 generic.go:334] "Generic (PLEG): container finished" podID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerID="62b2f79bc046e11609e05699eaeae6ecc9ef958243a086d1f166cc7c86c818ff" exitCode=0 Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.067943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerDied","Data":"62b2f79bc046e11609e05699eaeae6ecc9ef958243a086d1f166cc7c86c818ff"} Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.071751 4810 generic.go:334] "Generic (PLEG): container finished" podID="4b733985-b1bd-4163-998b-6998a05076e3" containerID="afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a" exitCode=0 Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.071816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerDied","Data":"afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a"} Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.084408 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8fwwl" podStartSLOduration=3.643784777 podStartE2EDuration="50.084386792s" podCreationTimestamp="2025-10-03 06:58:35 +0000 UTC" firstStartedPulling="2025-10-03 06:58:37.435195889 +0000 UTC m=+150.862446634" lastFinishedPulling="2025-10-03 06:59:23.875797904 +0000 UTC m=+197.303048649" observedRunningTime="2025-10-03 06:59:25.078463102 +0000 UTC m=+198.505713837" watchObservedRunningTime="2025-10-03 06:59:25.084386792 +0000 UTC m=+198.511637527" Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.316458 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" path="/var/lib/kubelet/pods/f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa/volumes" Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.873265 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:25 crc kubenswrapper[4810]: I1003 06:59:25.873959 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.083921 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerStarted","Data":"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19"} Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.086926 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerStarted","Data":"6cc250e75597b2c66f0992cc60200196021650b7a24f0ec4824d3cbac04a492e"} Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.089565 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerStarted","Data":"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302"} Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.105515 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xjng2" podStartSLOduration=1.944785225 podStartE2EDuration="49.105497956s" podCreationTimestamp="2025-10-03 06:58:37 +0000 UTC" firstStartedPulling="2025-10-03 06:58:38.537422419 +0000 UTC m=+151.964673154" lastFinishedPulling="2025-10-03 06:59:25.69813515 +0000 UTC m=+199.125385885" observedRunningTime="2025-10-03 06:59:26.103372875 +0000 UTC m=+199.530623610" watchObservedRunningTime="2025-10-03 06:59:26.105497956 +0000 UTC m=+199.532748691"
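
The pod_startup_latency_tracker entry above reports two figures for community-operators-8fwwl: podStartE2EDuration and podStartSLOduration. The numbers are consistent with the SLO figure being the end-to-end time (watchObservedRunningTime minus podCreationTimestamp) with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted: 50.084s end to end, of which 46.441s was spent pulling, leaving 3.644s. A minimal Go sketch of that arithmetic, using the timestamps copied from the entry (the helper and layout are illustrative, not kubelet code):

```go
// startup_durations.go: reproduce the arithmetic behind the
// pod_startup_latency_tracker entry for community-operators-8fwwl.
// Timestamps are copied from the log entry above; this is a sketch,
// not the kubelet's actual implementation.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-03 06:58:35 +0000 UTC")
	firstPull := mustParse("2025-10-03 06:58:37.435195889 +0000 UTC")
	lastPull := mustParse("2025-10-03 06:59:23.875797904 +0000 UTC")
	observed := mustParse("2025-10-03 06:59:25.084386792 +0000 UTC")

	e2e := observed.Sub(created)    // podStartE2EDuration
	pull := lastPull.Sub(firstPull) // time spent pulling images
	slo := e2e - pull               // podStartSLOduration (pull window excluded)

	fmt.Println(e2e, pull, slo) // 50.084386792s 46.440602015s 3.643784777s
}
```

The later "Observed pod startup duration" entries for the other marketplace pods can be checked the same way.
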
Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.118179 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9qvqd" podStartSLOduration=3.213202298 podStartE2EDuration="48.118173622s" podCreationTimestamp="2025-10-03 06:58:38 +0000 UTC" firstStartedPulling="2025-10-03 06:58:40.622163557 +0000 UTC m=+154.049414292" lastFinishedPulling="2025-10-03 06:59:25.527134891 +0000 UTC m=+198.954385616" observedRunningTime="2025-10-03 06:59:26.117612756 +0000 UTC m=+199.544880412" watchObservedRunningTime="2025-10-03 06:59:26.118173622 +0000 UTC m=+199.545424347" Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.144066 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9jdlp" podStartSLOduration=2.133995601 podStartE2EDuration="48.14404353s" podCreationTimestamp="2025-10-03 06:58:38 +0000 UTC" firstStartedPulling="2025-10-03 06:58:39.562489114 +0000 UTC m=+152.989739849" lastFinishedPulling="2025-10-03 06:59:25.572537043 +0000 UTC m=+198.999787778" observedRunningTime="2025-10-03 06:59:26.138075568 +0000 UTC m=+199.565326293" watchObservedRunningTime="2025-10-03 06:59:26.14404353 +0000 UTC m=+199.571294265" Oct 03 06:59:26 crc kubenswrapper[4810]: I1003 06:59:26.913963 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8fwwl" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="registry-server" probeResult="failure" output=< Oct 03 06:59:26 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 06:59:26 crc kubenswrapper[4810]: > Oct 03 06:59:27 crc kubenswrapper[4810]: I1003 06:59:27.357398 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:27 crc kubenswrapper[4810]: I1003 06:59:27.357472 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:27 crc kubenswrapper[4810]: I1003 06:59:27.433113 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:28 crc kubenswrapper[4810]: I1003 06:59:28.420096 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:28 crc kubenswrapper[4810]: I1003 06:59:28.456714 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:28 crc kubenswrapper[4810]: I1003 06:59:28.904040 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:28 crc kubenswrapper[4810]: I1003 06:59:28.904078 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:29 crc kubenswrapper[4810]: I1003 06:59:29.464187 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9jdlp" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="registry-server" probeResult="failure" output=< Oct 03 06:59:29 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 06:59:29 crc kubenswrapper[4810]: > Oct 03 06:59:29 crc kubenswrapper[4810]: I1003 06:59:29.946864 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9qvqd" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="registry-server" probeResult="failure" output=< Oct 03 06:59:29 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 06:59:29 crc kubenswrapper[4810]: >
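
Each of the startup-probe failures recorded above ends with the same output, timeout: failed to connect service ":50051" within 1s: the registry-server container exposes a gRPC endpoint on port 50051, and the probe gives it one second to answer a health check before reporting failure. Below is a minimal Go sketch of an equivalent check, assuming the standard gRPC health service is registered on that port; it illustrates the probe semantics, not the exact probe binary the catalog images ship.

```go
// grpc_probe.go: connect to a registry-server style gRPC endpoint and
// query the standard gRPC health service, failing if the check does
// not complete within one second (mirroring the probe output above).
package main

import (
	"context"
	"fmt"
	"os"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 1*time.Second)
	defer cancel()

	// Dial the same port the startup probe targets; grpc.WithBlock makes
	// the dial itself respect the one-second deadline.
	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock())
	if err != nil {
		fmt.Fprintf(os.Stderr, "timeout: failed to connect service %q within 1s: %v\n", ":50051", err)
		os.Exit(1)
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
	if err != nil || resp.GetStatus() != healthpb.HealthCheckResponse_SERVING {
		fmt.Fprintln(os.Stderr, "service is not SERVING:", err)
		os.Exit(1)
	}
	fmt.Println("status: SERVING")
}
```

A catalog pod that is still extracting and serving its index will fail such a check for a while, which matches the unhealthy-then-started probe transitions logged for these pods over the following seconds.
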
Oct 03 06:59:32 crc kubenswrapper[4810]: I1003 06:59:32.089168 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 06:59:32 crc kubenswrapper[4810]: I1003 06:59:32.089585 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 06:59:32 crc kubenswrapper[4810]: I1003 06:59:32.089692 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 06:59:32 crc kubenswrapper[4810]: I1003 06:59:32.090338 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 06:59:32 crc kubenswrapper[4810]: I1003 06:59:32.090396 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6" gracePeriod=600 Oct 03 06:59:33 crc kubenswrapper[4810]: I1003 06:59:33.129660 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6" exitCode=0 Oct 03 06:59:33 crc kubenswrapper[4810]: I1003 06:59:33.129805 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6"} Oct 03 06:59:33 crc kubenswrapper[4810]: I1003 06:59:33.130203 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5"} Oct 03 06:59:35 crc kubenswrapper[4810]: I1003 06:59:35.922479 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:35 crc kubenswrapper[4810]: I1003 06:59:35.988689 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:36 crc kubenswrapper[4810]: I1003 06:59:36.474847 4810 kubelet.go:2437] "SyncLoop DELETE"
source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:59:37 crc kubenswrapper[4810]: I1003 06:59:37.152916 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8fwwl" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="registry-server" containerID="cri-o://f4a0f1b25978d38847f6fcef923d0faa061bb1d0f04390be03f62f960a9a7d13" gracePeriod=2 Oct 03 06:59:37 crc kubenswrapper[4810]: I1003 06:59:37.401435 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.164538 4810 generic.go:334] "Generic (PLEG): container finished" podID="5635f031-25c4-438c-903a-9c537b59b55c" containerID="f4a0f1b25978d38847f6fcef923d0faa061bb1d0f04390be03f62f960a9a7d13" exitCode=0 Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.164655 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerDied","Data":"f4a0f1b25978d38847f6fcef923d0faa061bb1d0f04390be03f62f960a9a7d13"} Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.465678 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.521811 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.608076 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.700326 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities\") pod \"5635f031-25c4-438c-903a-9c537b59b55c\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.700418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gr42\" (UniqueName: \"kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42\") pod \"5635f031-25c4-438c-903a-9c537b59b55c\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.700496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content\") pod \"5635f031-25c4-438c-903a-9c537b59b55c\" (UID: \"5635f031-25c4-438c-903a-9c537b59b55c\") " Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.702111 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities" (OuterVolumeSpecName: "utilities") pod "5635f031-25c4-438c-903a-9c537b59b55c" (UID: "5635f031-25c4-438c-903a-9c537b59b55c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.710051 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42" (OuterVolumeSpecName: "kube-api-access-8gr42") pod "5635f031-25c4-438c-903a-9c537b59b55c" (UID: "5635f031-25c4-438c-903a-9c537b59b55c"). InnerVolumeSpecName "kube-api-access-8gr42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.758835 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5635f031-25c4-438c-903a-9c537b59b55c" (UID: "5635f031-25c4-438c-903a-9c537b59b55c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.802635 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.802666 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gr42\" (UniqueName: \"kubernetes.io/projected/5635f031-25c4-438c-903a-9c537b59b55c-kube-api-access-8gr42\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.802676 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5635f031-25c4-438c-903a-9c537b59b55c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.951276 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:38 crc kubenswrapper[4810]: I1003 06:59:38.996876 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.175518 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8fwwl" Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.185558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8fwwl" event={"ID":"5635f031-25c4-438c-903a-9c537b59b55c","Type":"ContainerDied","Data":"c66d012182a07126ac1fa3a5722799f84f176a59e19fd5ce694129b4b7278a6b"} Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.185612 4810 scope.go:117] "RemoveContainer" containerID="f4a0f1b25978d38847f6fcef923d0faa061bb1d0f04390be03f62f960a9a7d13" Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.213341 4810 scope.go:117] "RemoveContainer" containerID="ef188f3f7bec6b024017846bfd1169c05eafda8ee0438e1a5d64d5f5471b4d10" Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.218316 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.222597 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8fwwl"] Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.235333 4810 scope.go:117] "RemoveContainer" containerID="bd1c811185e3cb4482003129b47a583089a62d880e64a28f2619e1121a5c739a" Oct 03 06:59:39 crc kubenswrapper[4810]: I1003 06:59:39.309757 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5635f031-25c4-438c-903a-9c537b59b55c" path="/var/lib/kubelet/pods/5635f031-25c4-438c-903a-9c537b59b55c/volumes" Oct 03 06:59:41 crc kubenswrapper[4810]: I1003 06:59:41.671618 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:59:41 crc kubenswrapper[4810]: I1003 06:59:41.672676 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9qvqd" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="registry-server" containerID="cri-o://f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302" gracePeriod=2 Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.033850 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.047509 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities\") pod \"4b733985-b1bd-4163-998b-6998a05076e3\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.047594 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t6kp\" (UniqueName: \"kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp\") pod \"4b733985-b1bd-4163-998b-6998a05076e3\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.047640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content\") pod \"4b733985-b1bd-4163-998b-6998a05076e3\" (UID: \"4b733985-b1bd-4163-998b-6998a05076e3\") " Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.048680 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities" (OuterVolumeSpecName: "utilities") pod "4b733985-b1bd-4163-998b-6998a05076e3" (UID: "4b733985-b1bd-4163-998b-6998a05076e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.058169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp" (OuterVolumeSpecName: "kube-api-access-4t6kp") pod "4b733985-b1bd-4163-998b-6998a05076e3" (UID: "4b733985-b1bd-4163-998b-6998a05076e3"). InnerVolumeSpecName "kube-api-access-4t6kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.132061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b733985-b1bd-4163-998b-6998a05076e3" (UID: "4b733985-b1bd-4163-998b-6998a05076e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.148746 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.148778 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t6kp\" (UniqueName: \"kubernetes.io/projected/4b733985-b1bd-4163-998b-6998a05076e3-kube-api-access-4t6kp\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.148790 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b733985-b1bd-4163-998b-6998a05076e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.193195 4810 generic.go:334] "Generic (PLEG): container finished" podID="4b733985-b1bd-4163-998b-6998a05076e3" containerID="f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302" exitCode=0 Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.193261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerDied","Data":"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302"} Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.193304 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qvqd" event={"ID":"4b733985-b1bd-4163-998b-6998a05076e3","Type":"ContainerDied","Data":"e395f064a0a5a0f160e614e8b5b7a74cf153ba638a7dad615abcbcda3189a7d9"} Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.193327 4810 scope.go:117] "RemoveContainer" containerID="f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.193267 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qvqd" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.214102 4810 scope.go:117] "RemoveContainer" containerID="afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.223559 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.230396 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9qvqd"] Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.249511 4810 scope.go:117] "RemoveContainer" containerID="ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.265427 4810 scope.go:117] "RemoveContainer" containerID="f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.265881 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302\": container with ID starting with f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302 not found: ID does not exist" containerID="f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.265942 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302"} err="failed to get container status \"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302\": rpc error: code = NotFound desc = could not find container \"f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302\": container with ID starting with f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302 not found: ID does not exist" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.265976 4810 scope.go:117] "RemoveContainer" containerID="afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.266333 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a\": container with ID starting with afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a not found: ID does not exist" containerID="afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.266379 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a"} err="failed to get container status \"afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a\": rpc error: code = NotFound desc = could not find container \"afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a\": container with ID starting with afef8a342e8e15986b608e7a83709fe47917ac4681bc69d1e6fd7e22f713a44a not found: ID does not exist" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.266412 4810 scope.go:117] "RemoveContainer" containerID="ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.266810 4810 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303\": container with ID starting with ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303 not found: ID does not exist" containerID="ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.266846 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303"} err="failed to get container status \"ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303\": rpc error: code = NotFound desc = could not find container \"ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303\": container with ID starting with ce69ffbf9da6d4af2e3ec708c234d9b9471e5077f02f86f0fce50d8d6de81303 not found: ID does not exist" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352095 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p9mbq"] Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352359 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352376 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352388 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352397 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352409 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352417 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352429 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352436 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352446 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352453 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352463 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352473 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352484 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="registry-server"
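
The "RemoveContainer" / "ContainerStatus from runtime service failed" pairs above show the kubelet asking CRI-O about containers it has just deleted and getting back gRPC NotFound errors, which it logs as "DeleteContainer returned error" and then moves on: a container that is already gone needs no further cleanup. A small Go sketch of that error-classification pattern using the standard gRPC status package (the alreadyGone helper is hypothetical, named here only for illustration):

```go
// notfound.go: treat a CRI "container not found" answer as success when
// cleaning up, mirroring the DeleteContainer / ContainerStatus entries
// above. Sketch only; not kubelet source.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// alreadyGone reports whether the runtime says the container no longer
// exists, in which case deletion can be considered complete.
func alreadyGone(err error) bool {
	return status.Code(err) == codes.NotFound
}

func main() {
	// Simulate the runtime response recorded in the log.
	err := status.Error(codes.NotFound,
		`could not find container "f25ee06243e79daca07f41e4607b08649fde811b7d7355b4d506211a98544302"`)

	switch {
	case err == nil:
		fmt.Println("container removed")
	case alreadyGone(err):
		fmt.Println("container already absent, nothing left to do")
	default:
		fmt.Println("unexpected runtime error:", err)
	}
}
```
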
Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352491 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352498 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352505 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352513 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352519 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352528 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352535 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352549 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c6ac3-85e9-48d7-9a37-9dc36d070665" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352556 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="320c6ac3-85e9-48d7-9a37-9dc36d070665" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352571 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352578 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="extract-utilities" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352586 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c1bb594-4d88-457c-b763-a13d018fb939" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352595 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1bb594-4d88-457c-b763-a13d018fb939" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: E1003 06:59:42.352606 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352614 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="extract-content" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352733 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b733985-b1bd-4163-998b-6998a05076e3" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352749 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5635f031-25c4-438c-903a-9c537b59b55c" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003
06:59:42.352759 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63c741f-b306-4022-a6ec-f6dcf7d1e504" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352773 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="320c6ac3-85e9-48d7-9a37-9dc36d070665" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352784 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c1bb594-4d88-457c-b763-a13d018fb939" containerName="pruner" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.352794 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f666c630-3dc8-4fa6-8fa8-75bc0b76e2aa" containerName="registry-server" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.353302 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.381637 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p9mbq"] Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.452825 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-tls\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.452909 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-bound-sa-token\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.452990 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-trusted-ca\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.453016 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-certificates\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.453084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.453138 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/fef7f840-17b5-4d91-ab11-54b31cdbee3a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.453157 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2nm\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-kube-api-access-6f2nm\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.453257 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef7f840-17b5-4d91-ab11-54b31cdbee3a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.476084 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554363 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef7f840-17b5-4d91-ab11-54b31cdbee3a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554451 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2nm\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-kube-api-access-6f2nm\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554491 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef7f840-17b5-4d91-ab11-54b31cdbee3a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-tls\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554574 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-bound-sa-token\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: 
\"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554639 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-trusted-ca\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.554678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-certificates\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.555385 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fef7f840-17b5-4d91-ab11-54b31cdbee3a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.555878 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-trusted-ca\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.556769 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-certificates\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.559965 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-registry-tls\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.561151 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fef7f840-17b5-4d91-ab11-54b31cdbee3a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.570326 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-bound-sa-token\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.571884 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2nm\" (UniqueName: 
\"kubernetes.io/projected/fef7f840-17b5-4d91-ab11-54b31cdbee3a-kube-api-access-6f2nm\") pod \"image-registry-66df7c8f76-p9mbq\" (UID: \"fef7f840-17b5-4d91-ab11-54b31cdbee3a\") " pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.669141 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:42 crc kubenswrapper[4810]: I1003 06:59:42.892509 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p9mbq"] Oct 03 06:59:42 crc kubenswrapper[4810]: W1003 06:59:42.902743 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfef7f840_17b5_4d91_ab11_54b31cdbee3a.slice/crio-0c69d88e8e3478546e6572b0c2b0d3f316ae92f1c3751b4620f9ef8e953ab31c WatchSource:0}: Error finding container 0c69d88e8e3478546e6572b0c2b0d3f316ae92f1c3751b4620f9ef8e953ab31c: Status 404 returned error can't find the container with id 0c69d88e8e3478546e6572b0c2b0d3f316ae92f1c3751b4620f9ef8e953ab31c Oct 03 06:59:43 crc kubenswrapper[4810]: I1003 06:59:43.204168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" event={"ID":"fef7f840-17b5-4d91-ab11-54b31cdbee3a","Type":"ContainerStarted","Data":"0d3fd3db614915329e731713c5b9e6eee1d6874f5e981b3d6ea6f76d8e55cf1f"} Oct 03 06:59:43 crc kubenswrapper[4810]: I1003 06:59:43.204231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" event={"ID":"fef7f840-17b5-4d91-ab11-54b31cdbee3a","Type":"ContainerStarted","Data":"0c69d88e8e3478546e6572b0c2b0d3f316ae92f1c3751b4620f9ef8e953ab31c"} Oct 03 06:59:43 crc kubenswrapper[4810]: I1003 06:59:43.205733 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 06:59:43 crc kubenswrapper[4810]: I1003 06:59:43.310116 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b733985-b1bd-4163-998b-6998a05076e3" path="/var/lib/kubelet/pods/4b733985-b1bd-4163-998b-6998a05076e3/volumes" Oct 03 06:59:47 crc kubenswrapper[4810]: I1003 06:59:47.657768 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" podStartSLOduration=5.657749973 podStartE2EDuration="5.657749973s" podCreationTimestamp="2025-10-03 06:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:59:43.229353883 +0000 UTC m=+216.656604618" watchObservedRunningTime="2025-10-03 06:59:47.657749973 +0000 UTC m=+221.085000708" Oct 03 06:59:47 crc kubenswrapper[4810]: I1003 06:59:47.661121 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.599353 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.600462 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-69cqj" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="registry-server" 
containerID="cri-o://7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd" gracePeriod=30 Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.627325 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.627642 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xp4qz" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="registry-server" containerID="cri-o://0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49" gracePeriod=30 Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.637570 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.637864 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" containerID="cri-o://44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62" gracePeriod=30 Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.665030 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.665409 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xjng2" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="registry-server" containerID="cri-o://2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19" gracePeriod=30 Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.671189 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mxqpd"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.672685 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.679408 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.679705 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9jdlp" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="registry-server" containerID="cri-o://6cc250e75597b2c66f0992cc60200196021650b7a24f0ec4824d3cbac04a492e" gracePeriod=30 Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.691647 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mxqpd"] Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.762627 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dcvf\" (UniqueName: \"kubernetes.io/projected/68a22413-3bbe-475d-a98e-09c99becb176-kube-api-access-7dcvf\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.762684 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.762734 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: E1003 06:59:49.771994 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92710d73_0afa_4f41_b424_47adb9d5431d.slice/crio-0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92710d73_0afa_4f41_b424_47adb9d5431d.slice/crio-conmon-0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49.scope\": RecentStats: unable to find data in memory cache]" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.863785 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.864372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dcvf\" (UniqueName: \"kubernetes.io/projected/68a22413-3bbe-475d-a98e-09c99becb176-kube-api-access-7dcvf\") pod 
\"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.864420 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.866820 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.876004 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68a22413-3bbe-475d-a98e-09c99becb176-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.895549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dcvf\" (UniqueName: \"kubernetes.io/projected/68a22413-3bbe-475d-a98e-09c99becb176-kube-api-access-7dcvf\") pod \"marketplace-operator-79b997595-mxqpd\" (UID: \"68a22413-3bbe-475d-a98e-09c99becb176\") " pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:49 crc kubenswrapper[4810]: I1003 06:59:49.987643 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.066525 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.116285 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.144164 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.171276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grxnb\" (UniqueName: \"kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb\") pod \"92710d73-0afa-4f41-b424-47adb9d5431d\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.171329 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content\") pod \"92710d73-0afa-4f41-b424-47adb9d5431d\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.171375 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities\") pod \"92710d73-0afa-4f41-b424-47adb9d5431d\" (UID: \"92710d73-0afa-4f41-b424-47adb9d5431d\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.172549 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities" (OuterVolumeSpecName: "utilities") pod "92710d73-0afa-4f41-b424-47adb9d5431d" (UID: "92710d73-0afa-4f41-b424-47adb9d5431d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.173858 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.178058 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb" (OuterVolumeSpecName: "kube-api-access-grxnb") pod "92710d73-0afa-4f41-b424-47adb9d5431d" (UID: "92710d73-0afa-4f41-b424-47adb9d5431d"). InnerVolumeSpecName "kube-api-access-grxnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.231657 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92710d73-0afa-4f41-b424-47adb9d5431d" (UID: "92710d73-0afa-4f41-b424-47adb9d5431d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272320 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities\") pod \"c93baf43-7628-4225-8fad-d2543b491649\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272370 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-722nd\" (UniqueName: \"kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd\") pod \"c93baf43-7628-4225-8fad-d2543b491649\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") pod \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content\") pod \"c93baf43-7628-4225-8fad-d2543b491649\" (UID: \"c93baf43-7628-4225-8fad-d2543b491649\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krl79\" (UniqueName: \"kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79\") pod \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272514 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities\") pod \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272562 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") pod \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content\") pod \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\" (UID: \"3abedf38-c2d0-4d29-b0ae-216ecefd4a09\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272630 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxp69\" (UniqueName: \"kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69\") pod \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\" (UID: \"eb0f3787-ac23-4d02-8f2d-54c41686d0ed\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.272995 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-utilities\") on node \"crc\" 
DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.273018 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grxnb\" (UniqueName: \"kubernetes.io/projected/92710d73-0afa-4f41-b424-47adb9d5431d-kube-api-access-grxnb\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.273031 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92710d73-0afa-4f41-b424-47adb9d5431d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.278032 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "eb0f3787-ac23-4d02-8f2d-54c41686d0ed" (UID: "eb0f3787-ac23-4d02-8f2d-54c41686d0ed"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.278615 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities" (OuterVolumeSpecName: "utilities") pod "c93baf43-7628-4225-8fad-d2543b491649" (UID: "c93baf43-7628-4225-8fad-d2543b491649"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.285042 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79" (OuterVolumeSpecName: "kube-api-access-krl79") pod "3abedf38-c2d0-4d29-b0ae-216ecefd4a09" (UID: "3abedf38-c2d0-4d29-b0ae-216ecefd4a09"). InnerVolumeSpecName "kube-api-access-krl79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.292078 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd" (OuterVolumeSpecName: "kube-api-access-722nd") pod "c93baf43-7628-4225-8fad-d2543b491649" (UID: "c93baf43-7628-4225-8fad-d2543b491649"). InnerVolumeSpecName "kube-api-access-722nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.292392 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "eb0f3787-ac23-4d02-8f2d-54c41686d0ed" (UID: "eb0f3787-ac23-4d02-8f2d-54c41686d0ed"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.300104 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69" (OuterVolumeSpecName: "kube-api-access-nxp69") pod "eb0f3787-ac23-4d02-8f2d-54c41686d0ed" (UID: "eb0f3787-ac23-4d02-8f2d-54c41686d0ed"). InnerVolumeSpecName "kube-api-access-nxp69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.306945 4810 generic.go:334] "Generic (PLEG): container finished" podID="92710d73-0afa-4f41-b424-47adb9d5431d" containerID="0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49" exitCode=0 Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.307062 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerDied","Data":"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.307093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xp4qz" event={"ID":"92710d73-0afa-4f41-b424-47adb9d5431d","Type":"ContainerDied","Data":"2b6618e410fb239402351282bc3ca407d01a6ad2bab568e7442a4774f716b7f9"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.307112 4810 scope.go:117] "RemoveContainer" containerID="0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.307224 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xp4qz" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.312157 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3abedf38-c2d0-4d29-b0ae-216ecefd4a09" (UID: "3abedf38-c2d0-4d29-b0ae-216ecefd4a09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.315983 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities" (OuterVolumeSpecName: "utilities") pod "3abedf38-c2d0-4d29-b0ae-216ecefd4a09" (UID: "3abedf38-c2d0-4d29-b0ae-216ecefd4a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.329799 4810 generic.go:334] "Generic (PLEG): container finished" podID="c93baf43-7628-4225-8fad-d2543b491649" containerID="7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd" exitCode=0 Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.329884 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerDied","Data":"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.329940 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69cqj" event={"ID":"c93baf43-7628-4225-8fad-d2543b491649","Type":"ContainerDied","Data":"041f021fa659503d9c6536b55dce987d5ac0be73453f89360396dab0a55db3e6"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.330027 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69cqj" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.341132 4810 generic.go:334] "Generic (PLEG): container finished" podID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerID="2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19" exitCode=0 Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.341199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerDied","Data":"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.341232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xjng2" event={"ID":"3abedf38-c2d0-4d29-b0ae-216ecefd4a09","Type":"ContainerDied","Data":"0a71d4d7a4356e6a673493d52c69e906e49908e70226b2e5da82c53095c87c80"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.341299 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xjng2" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.344205 4810 scope.go:117] "RemoveContainer" containerID="7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.354179 4810 generic.go:334] "Generic (PLEG): container finished" podID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerID="6cc250e75597b2c66f0992cc60200196021650b7a24f0ec4824d3cbac04a492e" exitCode=0 Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.354247 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerDied","Data":"6cc250e75597b2c66f0992cc60200196021650b7a24f0ec4824d3cbac04a492e"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.355628 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.357874 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xp4qz"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.365005 4810 scope.go:117] "RemoveContainer" containerID="b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.367212 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c93baf43-7628-4225-8fad-d2543b491649" (UID: "c93baf43-7628-4225-8fad-d2543b491649"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.367344 4810 generic.go:334] "Generic (PLEG): container finished" podID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerID="44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62" exitCode=0 Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.367376 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" event={"ID":"eb0f3787-ac23-4d02-8f2d-54c41686d0ed","Type":"ContainerDied","Data":"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.367404 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" event={"ID":"eb0f3787-ac23-4d02-8f2d-54c41686d0ed","Type":"ContainerDied","Data":"33253e537731c78c8973ad0300a8bb8f76e92335f2b19a3560d9126135651aba"} Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.367474 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vk2ql" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379469 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379502 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krl79\" (UniqueName: \"kubernetes.io/projected/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-kube-api-access-krl79\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379513 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379525 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379536 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abedf38-c2d0-4d29-b0ae-216ecefd4a09-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379546 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxp69\" (UniqueName: \"kubernetes.io/projected/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-kube-api-access-nxp69\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379554 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c93baf43-7628-4225-8fad-d2543b491649-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379562 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-722nd\" (UniqueName: \"kubernetes.io/projected/c93baf43-7628-4225-8fad-d2543b491649-kube-api-access-722nd\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.379572 4810 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb0f3787-ac23-4d02-8f2d-54c41686d0ed-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.384085 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.391359 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xjng2"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.409839 4810 scope.go:117] "RemoveContainer" containerID="0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.410911 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49\": container with ID starting with 0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49 not found: ID does not exist" containerID="0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.411091 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49"} err="failed to get container status \"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49\": rpc error: code = NotFound desc = could not find container \"0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49\": container with ID starting with 0d3be8b8afd3da57ccd75b82a02e8c7c7ec8b3762d0226bc2acb76b8a9912f49 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.411170 4810 scope.go:117] "RemoveContainer" containerID="7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.411640 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e\": container with ID starting with 7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e not found: ID does not exist" containerID="7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.411672 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e"} err="failed to get container status \"7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e\": rpc error: code = NotFound desc = could not find container \"7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e\": container with ID starting with 7673ed43b000778877e3e7c8dace6855d82162650498e9b613eed78f9d34cd2e not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.411689 4810 scope.go:117] "RemoveContainer" containerID="b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.412273 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9\": container with ID starting with b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9 not found: ID does not exist" 
containerID="b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.412303 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9"} err="failed to get container status \"b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9\": rpc error: code = NotFound desc = could not find container \"b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9\": container with ID starting with b9c29f499e7f3b0d8d52211a9779c9dea96514b3fb6d72bf6d9681e3917554f9 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.412318 4810 scope.go:117] "RemoveContainer" containerID="7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.421100 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.424008 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vk2ql"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.430258 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mxqpd"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.434473 4810 scope.go:117] "RemoveContainer" containerID="53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.455579 4810 scope.go:117] "RemoveContainer" containerID="905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.476344 4810 scope.go:117] "RemoveContainer" containerID="7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.476788 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd\": container with ID starting with 7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd not found: ID does not exist" containerID="7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.476927 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd"} err="failed to get container status \"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd\": rpc error: code = NotFound desc = could not find container \"7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd\": container with ID starting with 7dd0f1104f255a79d0952d25b27e686c4262f1189dc07781acb741d1e72505cd not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.477023 4810 scope.go:117] "RemoveContainer" containerID="53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.477454 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb\": container with ID starting with 53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb not found: ID does 
not exist" containerID="53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.477485 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb"} err="failed to get container status \"53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb\": rpc error: code = NotFound desc = could not find container \"53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb\": container with ID starting with 53348fd69e3aac9e446757f41c177625b411ef51cdc66788f71309a6cf2ab0fb not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.477511 4810 scope.go:117] "RemoveContainer" containerID="905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.477817 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2\": container with ID starting with 905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2 not found: ID does not exist" containerID="905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.477865 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2"} err="failed to get container status \"905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2\": rpc error: code = NotFound desc = could not find container \"905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2\": container with ID starting with 905df59ac68bee58bf41194670cab4844454c55e83c53f271f12197a9ede97e2 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.477911 4810 scope.go:117] "RemoveContainer" containerID="2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.495213 4810 scope.go:117] "RemoveContainer" containerID="55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.528414 4810 scope.go:117] "RemoveContainer" containerID="00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.556970 4810 scope.go:117] "RemoveContainer" containerID="2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.557547 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19\": container with ID starting with 2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19 not found: ID does not exist" containerID="2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.557589 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19"} err="failed to get container status \"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19\": rpc error: code = NotFound desc = could not find container 
\"2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19\": container with ID starting with 2e933777eac02b60c9eb9906eaf1daaf15b21daf44f8868e59596043914e5c19 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.557619 4810 scope.go:117] "RemoveContainer" containerID="55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.558527 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263\": container with ID starting with 55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263 not found: ID does not exist" containerID="55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.558619 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263"} err="failed to get container status \"55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263\": rpc error: code = NotFound desc = could not find container \"55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263\": container with ID starting with 55119be2a9fab4363cae296af625e8aa971b29c7e80a24c58e333bcd1a2ef263 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.558636 4810 scope.go:117] "RemoveContainer" containerID="00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.559018 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4\": container with ID starting with 00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4 not found: ID does not exist" containerID="00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.559038 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4"} err="failed to get container status \"00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4\": rpc error: code = NotFound desc = could not find container \"00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4\": container with ID starting with 00b4628d24eefafd8bcdfe028f387ed52b6e3679f835f1ba87da0fc29ad506c4 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.559052 4810 scope.go:117] "RemoveContainer" containerID="44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.588027 4810 scope.go:117] "RemoveContainer" containerID="44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62" Oct 03 06:59:50 crc kubenswrapper[4810]: E1003 06:59:50.588791 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62\": container with ID starting with 44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62 not found: ID does not exist" containerID="44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 
06:59:50.588852 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62"} err="failed to get container status \"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62\": rpc error: code = NotFound desc = could not find container \"44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62\": container with ID starting with 44e0782a52f4193dd180b47e7da0b86c86aa19b48f537e83dd94f83687fb7c62 not found: ID does not exist" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.666237 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.673362 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-69cqj"] Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.765568 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.888029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlgp\" (UniqueName: \"kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp\") pod \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.888548 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities\") pod \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.888640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content\") pod \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\" (UID: \"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf\") " Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.889473 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities" (OuterVolumeSpecName: "utilities") pod "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" (UID: "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.896485 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp" (OuterVolumeSpecName: "kube-api-access-wmlgp") pod "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" (UID: "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf"). InnerVolumeSpecName "kube-api-access-wmlgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.989859 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlgp\" (UniqueName: \"kubernetes.io/projected/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-kube-api-access-wmlgp\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.990121 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:50 crc kubenswrapper[4810]: I1003 06:59:50.989175 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" (UID: "7cc3ed29-4f96-43fd-8efa-b4e07a898fdf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.091389 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.309797 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" path="/var/lib/kubelet/pods/3abedf38-c2d0-4d29-b0ae-216ecefd4a09/volumes" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.311116 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" path="/var/lib/kubelet/pods/92710d73-0afa-4f41-b424-47adb9d5431d/volumes" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.312372 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93baf43-7628-4225-8fad-d2543b491649" path="/var/lib/kubelet/pods/c93baf43-7628-4225-8fad-d2543b491649/volumes" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.314616 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" path="/var/lib/kubelet/pods/eb0f3787-ac23-4d02-8f2d-54c41686d0ed/volumes" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.378857 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9jdlp" event={"ID":"7cc3ed29-4f96-43fd-8efa-b4e07a898fdf","Type":"ContainerDied","Data":"da798c9bf0951c208e5d311c1a0db891ed775772a9b72b2df854ab7321325482"} Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.378929 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9jdlp" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.378948 4810 scope.go:117] "RemoveContainer" containerID="6cc250e75597b2c66f0992cc60200196021650b7a24f0ec4824d3cbac04a492e" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.383755 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" event={"ID":"68a22413-3bbe-475d-a98e-09c99becb176","Type":"ContainerStarted","Data":"f644e463d96e625278521c384a6af6e6eb9796247aa3e28d8f1cbac437b58f97"} Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.383820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" event={"ID":"68a22413-3bbe-475d-a98e-09c99becb176","Type":"ContainerStarted","Data":"672a71a3dbd2c8857b0814839ab49a2484c5912bbc71aea5d90f8690014958e2"} Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.384389 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.388180 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.404553 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mxqpd" podStartSLOduration=2.404520555 podStartE2EDuration="2.404520555s" podCreationTimestamp="2025-10-03 06:59:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 06:59:51.397478342 +0000 UTC m=+224.824729107" watchObservedRunningTime="2025-10-03 06:59:51.404520555 +0000 UTC m=+224.831771300" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.406346 4810 scope.go:117] "RemoveContainer" containerID="62b2f79bc046e11609e05699eaeae6ecc9ef958243a086d1f166cc7c86c818ff" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.437431 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.448729 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9jdlp"] Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.460769 4810 scope.go:117] "RemoveContainer" containerID="2233f712b12697ce6ec360ba42c356f64cef3726da86b75dd4108b4247e373e2" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.822882 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9w6dm"] Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823134 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823148 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823157 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823165 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" 
containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823176 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823183 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823190 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823196 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823209 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823215 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823225 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823231 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823238 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823243 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823250 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823255 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="extract-content" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823263 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823268 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823277 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823283 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823292 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823297 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823305 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823311 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="extract-utilities" Oct 03 06:59:51 crc kubenswrapper[4810]: E1003 06:59:51.823318 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823325 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823414 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823425 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93baf43-7628-4225-8fad-d2543b491649" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823434 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="92710d73-0afa-4f41-b424-47adb9d5431d" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823443 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0f3787-ac23-4d02-8f2d-54c41686d0ed" containerName="marketplace-operator" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.823451 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abedf38-c2d0-4d29-b0ae-216ecefd4a09" containerName="registry-server" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.824133 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.826058 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.836521 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w6dm"] Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.902277 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2xn\" (UniqueName: \"kubernetes.io/projected/c238f470-c89c-4992-8bb7-ad1c2cd58553-kube-api-access-5h2xn\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.902369 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-utilities\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:51 crc kubenswrapper[4810]: I1003 06:59:51.902428 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-catalog-content\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.003542 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2xn\" (UniqueName: \"kubernetes.io/projected/c238f470-c89c-4992-8bb7-ad1c2cd58553-kube-api-access-5h2xn\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.003746 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-utilities\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.003862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-catalog-content\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.004689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-utilities\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.004734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c238f470-c89c-4992-8bb7-ad1c2cd58553-catalog-content\") pod \"redhat-marketplace-9w6dm\" (UID: 
\"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.026167 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.027210 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.031156 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.032379 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2xn\" (UniqueName: \"kubernetes.io/projected/c238f470-c89c-4992-8bb7-ad1c2cd58553-kube-api-access-5h2xn\") pod \"redhat-marketplace-9w6dm\" (UID: \"c238f470-c89c-4992-8bb7-ad1c2cd58553\") " pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.041585 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.105612 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.105692 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpv9s\" (UniqueName: \"kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.105743 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.141030 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.207139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpv9s\" (UniqueName: \"kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.207559 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.207606 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.208289 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.208883 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.233118 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpv9s\" (UniqueName: \"kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s\") pod \"certified-operators-mmvkz\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.359476 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.570820 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9w6dm"] Oct 03 06:59:52 crc kubenswrapper[4810]: I1003 06:59:52.574776 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.311114 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc3ed29-4f96-43fd-8efa-b4e07a898fdf" path="/var/lib/kubelet/pods/7cc3ed29-4f96-43fd-8efa-b4e07a898fdf/volumes" Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.400882 4810 generic.go:334] "Generic (PLEG): container finished" podID="c238f470-c89c-4992-8bb7-ad1c2cd58553" containerID="7103b80ffb0735868290f04105e715273a202e021fa011598fbe8557367be74f" exitCode=0 Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.403503 4810 generic.go:334] "Generic (PLEG): container finished" podID="0a41a867-b179-4665-b79f-de759a045f82" containerID="92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229" exitCode=0 Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.400943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w6dm" event={"ID":"c238f470-c89c-4992-8bb7-ad1c2cd58553","Type":"ContainerDied","Data":"7103b80ffb0735868290f04105e715273a202e021fa011598fbe8557367be74f"} Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.404036 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w6dm" event={"ID":"c238f470-c89c-4992-8bb7-ad1c2cd58553","Type":"ContainerStarted","Data":"6c1f26607749ca5638313888bb7ca0278c791e591fcf967389cd5a2be0c6f902"} Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.404058 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerDied","Data":"92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229"} Oct 03 06:59:53 crc kubenswrapper[4810]: I1003 06:59:53.404073 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerStarted","Data":"d45ccaaee75f8d623d012a748930b68d0f48aa06813830245229fa8591620395"} Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.221927 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qfw9w"] Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.223478 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.228411 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.233636 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfw9w"] Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.355346 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-utilities\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.356150 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gnm\" (UniqueName: \"kubernetes.io/projected/276bc9fb-9280-4ec8-8f8a-32f482040f97-kube-api-access-99gnm\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.356401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-catalog-content\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.417543 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerStarted","Data":"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781"} Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.421185 4810 generic.go:334] "Generic (PLEG): container finished" podID="c238f470-c89c-4992-8bb7-ad1c2cd58553" containerID="9fe9a1457f756225878a418c5eca5b3aff2ca268609b4baccd0b1036e1c86e34" exitCode=0 Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.421250 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w6dm" event={"ID":"c238f470-c89c-4992-8bb7-ad1c2cd58553","Type":"ContainerDied","Data":"9fe9a1457f756225878a418c5eca5b3aff2ca268609b4baccd0b1036e1c86e34"} Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.456235 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.458411 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-catalog-content\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.458509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-utilities\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 
06:59:54.458581 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99gnm\" (UniqueName: \"kubernetes.io/projected/276bc9fb-9280-4ec8-8f8a-32f482040f97-kube-api-access-99gnm\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.459147 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-utilities\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.460563 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/276bc9fb-9280-4ec8-8f8a-32f482040f97-catalog-content\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.467982 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.472401 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.472477 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.493374 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gnm\" (UniqueName: \"kubernetes.io/projected/276bc9fb-9280-4ec8-8f8a-32f482040f97-kube-api-access-99gnm\") pod \"redhat-operators-qfw9w\" (UID: \"276bc9fb-9280-4ec8-8f8a-32f482040f97\") " pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.560857 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6dfp\" (UniqueName: \"kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.561059 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.561140 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.583106 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.662641 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.662721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.662764 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6dfp\" (UniqueName: \"kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.664263 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.664492 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.707077 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6dfp\" (UniqueName: \"kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp\") pod \"community-operators-s8hjs\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.823722 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 06:59:54 crc kubenswrapper[4810]: I1003 06:59:54.843254 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qfw9w"] Oct 03 06:59:54 crc kubenswrapper[4810]: W1003 06:59:54.857124 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276bc9fb_9280_4ec8_8f8a_32f482040f97.slice/crio-5a24a976550c6c2202fa1068efcc5a68900e0b92a69d1dc562647aaea672157f WatchSource:0}: Error finding container 5a24a976550c6c2202fa1068efcc5a68900e0b92a69d1dc562647aaea672157f: Status 404 returned error can't find the container with id 5a24a976550c6c2202fa1068efcc5a68900e0b92a69d1dc562647aaea672157f Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.023489 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 06:59:55 crc kubenswrapper[4810]: W1003 06:59:55.035106 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda48cb014_8481_4497_9047_8c319809f4cc.slice/crio-afd31301af32390b2e12643163aeb1846a5c205df7491b1d3122924166be8d4e WatchSource:0}: Error finding container afd31301af32390b2e12643163aeb1846a5c205df7491b1d3122924166be8d4e: Status 404 returned error can't find the container with id afd31301af32390b2e12643163aeb1846a5c205df7491b1d3122924166be8d4e Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.430044 4810 generic.go:334] "Generic (PLEG): container finished" podID="a48cb014-8481-4497-9047-8c319809f4cc" containerID="5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735" exitCode=0 Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.430138 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerDied","Data":"5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.430479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerStarted","Data":"afd31301af32390b2e12643163aeb1846a5c205df7491b1d3122924166be8d4e"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.435141 4810 generic.go:334] "Generic (PLEG): container finished" podID="276bc9fb-9280-4ec8-8f8a-32f482040f97" containerID="f626141f74648add26206fa8642188d57e73323cc903e1d40dd51c8b1cf9ddcc" exitCode=0 Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.435230 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw9w" event={"ID":"276bc9fb-9280-4ec8-8f8a-32f482040f97","Type":"ContainerDied","Data":"f626141f74648add26206fa8642188d57e73323cc903e1d40dd51c8b1cf9ddcc"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.435309 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw9w" event={"ID":"276bc9fb-9280-4ec8-8f8a-32f482040f97","Type":"ContainerStarted","Data":"5a24a976550c6c2202fa1068efcc5a68900e0b92a69d1dc562647aaea672157f"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.442037 4810 generic.go:334] "Generic (PLEG): container finished" podID="0a41a867-b179-4665-b79f-de759a045f82" containerID="2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781" exitCode=0 
Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.442113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerDied","Data":"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.442137 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerStarted","Data":"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.447823 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9w6dm" event={"ID":"c238f470-c89c-4992-8bb7-ad1c2cd58553","Type":"ContainerStarted","Data":"7cc8eb856da3875594affb09b1522b4f97acaf389ce55ee223cc830a713e4c8f"} Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.472302 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmvkz" podStartSLOduration=1.980431119 podStartE2EDuration="3.472276719s" podCreationTimestamp="2025-10-03 06:59:52 +0000 UTC" firstStartedPulling="2025-10-03 06:59:53.40750729 +0000 UTC m=+226.834758025" lastFinishedPulling="2025-10-03 06:59:54.89935289 +0000 UTC m=+228.326603625" observedRunningTime="2025-10-03 06:59:55.468541771 +0000 UTC m=+228.895792526" watchObservedRunningTime="2025-10-03 06:59:55.472276719 +0000 UTC m=+228.899527464" Oct 03 06:59:55 crc kubenswrapper[4810]: I1003 06:59:55.492501 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9w6dm" podStartSLOduration=2.967809185 podStartE2EDuration="4.492480773s" podCreationTimestamp="2025-10-03 06:59:51 +0000 UTC" firstStartedPulling="2025-10-03 06:59:53.402511306 +0000 UTC m=+226.829762041" lastFinishedPulling="2025-10-03 06:59:54.927182894 +0000 UTC m=+228.354433629" observedRunningTime="2025-10-03 06:59:55.489586539 +0000 UTC m=+228.916837274" watchObservedRunningTime="2025-10-03 06:59:55.492480773 +0000 UTC m=+228.919731508" Oct 03 06:59:56 crc kubenswrapper[4810]: I1003 06:59:56.456029 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerStarted","Data":"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2"} Oct 03 06:59:57 crc kubenswrapper[4810]: I1003 06:59:57.463078 4810 generic.go:334] "Generic (PLEG): container finished" podID="a48cb014-8481-4497-9047-8c319809f4cc" containerID="2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2" exitCode=0 Oct 03 06:59:57 crc kubenswrapper[4810]: I1003 06:59:57.463510 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerDied","Data":"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2"} Oct 03 06:59:57 crc kubenswrapper[4810]: I1003 06:59:57.466748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw9w" event={"ID":"276bc9fb-9280-4ec8-8f8a-32f482040f97","Type":"ContainerStarted","Data":"58e2f49a6f15a06daee8a5418020d3e3c4887d13d2a79b7199efb37842f2177b"} Oct 03 06:59:58 crc kubenswrapper[4810]: I1003 06:59:58.474398 4810 generic.go:334] "Generic (PLEG): 
container finished" podID="276bc9fb-9280-4ec8-8f8a-32f482040f97" containerID="58e2f49a6f15a06daee8a5418020d3e3c4887d13d2a79b7199efb37842f2177b" exitCode=0 Oct 03 06:59:58 crc kubenswrapper[4810]: I1003 06:59:58.474522 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw9w" event={"ID":"276bc9fb-9280-4ec8-8f8a-32f482040f97","Type":"ContainerDied","Data":"58e2f49a6f15a06daee8a5418020d3e3c4887d13d2a79b7199efb37842f2177b"} Oct 03 06:59:59 crc kubenswrapper[4810]: I1003 06:59:59.483854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerStarted","Data":"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148"} Oct 03 06:59:59 crc kubenswrapper[4810]: I1003 06:59:59.504593 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s8hjs" podStartSLOduration=2.249973642 podStartE2EDuration="5.504571508s" podCreationTimestamp="2025-10-03 06:59:54 +0000 UTC" firstStartedPulling="2025-10-03 06:59:55.432003486 +0000 UTC m=+228.859254221" lastFinishedPulling="2025-10-03 06:59:58.686601352 +0000 UTC m=+232.113852087" observedRunningTime="2025-10-03 06:59:59.503349073 +0000 UTC m=+232.930599808" watchObservedRunningTime="2025-10-03 06:59:59.504571508 +0000 UTC m=+232.931822243" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.141009 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69"] Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.142114 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.152483 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.152597 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.153242 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69"] Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.244421 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.244510 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkdj7\" (UniqueName: \"kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.244545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.346646 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.347282 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.347333 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkdj7\" (UniqueName: \"kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.348437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.354127 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.368348 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkdj7\" (UniqueName: \"kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7\") pod \"collect-profiles-29324580-tnt69\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.464770 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:00 crc kubenswrapper[4810]: I1003 07:00:00.815173 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69"] Oct 03 07:00:01 crc kubenswrapper[4810]: I1003 07:00:01.493571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" event={"ID":"2f286eee-b0b9-45c5-95d9-5ac9c44376ed","Type":"ContainerStarted","Data":"299f3eb88c9d6df707e6cd42f39d9af1e69f5f9af480e2006a6277fbf8bda1c1"} Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.142054 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.142394 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.191565 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.360335 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.360478 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.409958 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.500938 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qfw9w" event={"ID":"276bc9fb-9280-4ec8-8f8a-32f482040f97","Type":"ContainerStarted","Data":"610ba3afb484a2f18488d7c37c79896e0013ec64487c54e52fcdea2eb1d94600"} Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.503190 4810 generic.go:334] "Generic (PLEG): container finished" podID="2f286eee-b0b9-45c5-95d9-5ac9c44376ed" containerID="fa5bbe7f3f0cba1cdc8bb399b734e2c841c6dcbc14c84d846c08c3d3d1ce32a3" exitCode=0 Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.503279 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" event={"ID":"2f286eee-b0b9-45c5-95d9-5ac9c44376ed","Type":"ContainerDied","Data":"fa5bbe7f3f0cba1cdc8bb399b734e2c841c6dcbc14c84d846c08c3d3d1ce32a3"} Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.522185 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qfw9w" podStartSLOduration=3.349292836 podStartE2EDuration="8.52215288s" podCreationTimestamp="2025-10-03 06:59:54 +0000 UTC" firstStartedPulling="2025-10-03 06:59:55.439146112 +0000 UTC m=+228.866396847" lastFinishedPulling="2025-10-03 07:00:00.612006156 +0000 UTC m=+234.039256891" observedRunningTime="2025-10-03 07:00:02.518960587 +0000 UTC m=+235.946211332" watchObservedRunningTime="2025-10-03 07:00:02.52215288 +0000 UTC m=+235.949403615" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.550027 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:00:02 crc kubenswrapper[4810]: 
I1003 07:00:02.562766 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9w6dm" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.674798 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p9mbq" Oct 03 07:00:02 crc kubenswrapper[4810]: I1003 07:00:02.731501 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.759681 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.903076 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume\") pod \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.903126 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkdj7\" (UniqueName: \"kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7\") pod \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.903154 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume\") pod \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\" (UID: \"2f286eee-b0b9-45c5-95d9-5ac9c44376ed\") " Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.904394 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f286eee-b0b9-45c5-95d9-5ac9c44376ed" (UID: "2f286eee-b0b9-45c5-95d9-5ac9c44376ed"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.909732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7" (OuterVolumeSpecName: "kube-api-access-jkdj7") pod "2f286eee-b0b9-45c5-95d9-5ac9c44376ed" (UID: "2f286eee-b0b9-45c5-95d9-5ac9c44376ed"). InnerVolumeSpecName "kube-api-access-jkdj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:00:03 crc kubenswrapper[4810]: I1003 07:00:03.910532 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f286eee-b0b9-45c5-95d9-5ac9c44376ed" (UID: "2f286eee-b0b9-45c5-95d9-5ac9c44376ed"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.004884 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.004949 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkdj7\" (UniqueName: \"kubernetes.io/projected/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-kube-api-access-jkdj7\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.004960 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f286eee-b0b9-45c5-95d9-5ac9c44376ed-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.515620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" event={"ID":"2f286eee-b0b9-45c5-95d9-5ac9c44376ed","Type":"ContainerDied","Data":"299f3eb88c9d6df707e6cd42f39d9af1e69f5f9af480e2006a6277fbf8bda1c1"} Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.515686 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299f3eb88c9d6df707e6cd42f39d9af1e69f5f9af480e2006a6277fbf8bda1c1" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.515655 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.583401 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.583480 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.824555 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.824620 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:00:04 crc kubenswrapper[4810]: I1003 07:00:04.868407 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:00:05 crc kubenswrapper[4810]: I1003 07:00:05.569769 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:00:05 crc kubenswrapper[4810]: I1003 07:00:05.640741 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qfw9w" podUID="276bc9fb-9280-4ec8-8f8a-32f482040f97" containerName="registry-server" probeResult="failure" output=< Oct 03 07:00:05 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 07:00:05 crc kubenswrapper[4810]: > Oct 03 07:00:12 crc kubenswrapper[4810]: I1003 07:00:12.694863 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" 
containerID="cri-o://2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab" gracePeriod=15 Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.058452 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.089585 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp"] Oct 03 07:00:13 crc kubenswrapper[4810]: E1003 07:00:13.089805 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.089818 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" Oct 03 07:00:13 crc kubenswrapper[4810]: E1003 07:00:13.089844 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f286eee-b0b9-45c5-95d9-5ac9c44376ed" containerName="collect-profiles" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.089851 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f286eee-b0b9-45c5-95d9-5ac9c44376ed" containerName="collect-profiles" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.089965 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5629a2-a195-41d3-a776-551b79952630" containerName="oauth-openshift" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.089981 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f286eee-b0b9-45c5-95d9-5ac9c44376ed" containerName="collect-profiles" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.090408 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.106087 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp"] Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.174359 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.174845 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175037 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175170 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175382 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175675 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbn2d\" (UniqueName: \"kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175929 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176051 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.175816 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176157 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176381 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176442 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection\") pod \"bf5629a2-a195-41d3-a776-551b79952630\" (UID: \"bf5629a2-a195-41d3-a776-551b79952630\") " Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.176940 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zknl4\" (UniqueName: \"kubernetes.io/projected/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-kube-api-access-zknl4\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177011 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177079 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177151 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177268 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177165 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.177304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-service-ca\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.178013 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.178782 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.178609 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.179260 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.180214 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-session\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181036 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-error\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181110 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-dir\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181133 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-login\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181161 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-policies\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-router-certs\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181310 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181307 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181329 4810 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf5629a2-a195-41d3-a776-551b79952630-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181419 4810 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181436 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181449 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.181924 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.182585 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.182875 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d" (OuterVolumeSpecName: "kube-api-access-qbn2d") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). 
InnerVolumeSpecName "kube-api-access-qbn2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.183462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.183707 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.184462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.188545 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.189401 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "bf5629a2-a195-41d3-a776-551b79952630" (UID: "bf5629a2-a195-41d3-a776-551b79952630"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zknl4\" (UniqueName: \"kubernetes.io/projected/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-kube-api-access-zknl4\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283356 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-service-ca\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283480 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283495 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: 
\"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-session\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-error\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283556 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-dir\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283575 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-login\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283593 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-policies\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283612 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-router-certs\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283671 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283682 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283694 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbn2d\" (UniqueName: \"kubernetes.io/projected/bf5629a2-a195-41d3-a776-551b79952630-kube-api-access-qbn2d\") on 
node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283704 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283714 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283724 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283735 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283745 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.283755 4810 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bf5629a2-a195-41d3-a776-551b79952630-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.284746 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-dir\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.285417 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-service-ca\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.285882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.286872 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-audit-policies\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc 
kubenswrapper[4810]: I1003 07:00:13.287194 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.287539 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-router-certs\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.287545 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.289327 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-session\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.297127 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.297276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-login\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.297296 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.298699 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 
07:00:13.298945 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-v4-0-config-user-template-error\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.304669 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zknl4\" (UniqueName: \"kubernetes.io/projected/4dfa791b-d340-4d5c-a4f0-a4bc6001d98b-kube-api-access-zknl4\") pod \"oauth-openshift-846dc6fc5d-qh6cp\" (UID: \"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b\") " pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.409320 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.568391 4810 generic.go:334] "Generic (PLEG): container finished" podID="bf5629a2-a195-41d3-a776-551b79952630" containerID="2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab" exitCode=0 Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.568453 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.568475 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" event={"ID":"bf5629a2-a195-41d3-a776-551b79952630","Type":"ContainerDied","Data":"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab"} Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.568979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xwhmw" event={"ID":"bf5629a2-a195-41d3-a776-551b79952630","Type":"ContainerDied","Data":"bf1fcbc36fd3b09bbde16cd812b032d50afdf0efd9cbc90866291586fc8eeed7"} Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.569233 4810 scope.go:117] "RemoveContainer" containerID="2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.591282 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.596304 4810 scope.go:117] "RemoveContainer" containerID="2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab" Oct 03 07:00:13 crc kubenswrapper[4810]: E1003 07:00:13.596726 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab\": container with ID starting with 2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab not found: ID does not exist" containerID="2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.596764 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab"} err="failed to get container status \"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab\": rpc error: code = NotFound desc = could not find container 
\"2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab\": container with ID starting with 2517c24f5a63ffb49ea2831f2a7396ce3c098ef581dd33ad96c601c56bd366ab not found: ID does not exist" Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.599522 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xwhmw"] Oct 03 07:00:13 crc kubenswrapper[4810]: I1003 07:00:13.603792 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp"] Oct 03 07:00:13 crc kubenswrapper[4810]: W1003 07:00:13.606029 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dfa791b_d340_4d5c_a4f0_a4bc6001d98b.slice/crio-b2c782f5d1586db8c1753c1ebaa2498bf19081bec4c543edb90a12e97e5c2609 WatchSource:0}: Error finding container b2c782f5d1586db8c1753c1ebaa2498bf19081bec4c543edb90a12e97e5c2609: Status 404 returned error can't find the container with id b2c782f5d1586db8c1753c1ebaa2498bf19081bec4c543edb90a12e97e5c2609 Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.577376 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" event={"ID":"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b","Type":"ContainerStarted","Data":"b7384599c1307c44f963a35a9a455190199c6e66027df5e370cd42b97d0da579"} Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.577451 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" event={"ID":"4dfa791b-d340-4d5c-a4f0-a4bc6001d98b","Type":"ContainerStarted","Data":"b2c782f5d1586db8c1753c1ebaa2498bf19081bec4c543edb90a12e97e5c2609"} Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.579670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.585654 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.602261 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-846dc6fc5d-qh6cp" podStartSLOduration=27.602238821 podStartE2EDuration="27.602238821s" podCreationTimestamp="2025-10-03 06:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:00:14.598766041 +0000 UTC m=+248.026016776" watchObservedRunningTime="2025-10-03 07:00:14.602238821 +0000 UTC m=+248.029489556" Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.686671 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 07:00:14 crc kubenswrapper[4810]: I1003 07:00:14.785205 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qfw9w" Oct 03 07:00:15 crc kubenswrapper[4810]: I1003 07:00:15.314302 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5629a2-a195-41d3-a776-551b79952630" path="/var/lib/kubelet/pods/bf5629a2-a195-41d3-a776-551b79952630/volumes" Oct 03 07:00:27 crc kubenswrapper[4810]: I1003 07:00:27.778156 4810 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" podUID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" containerName="registry" containerID="cri-o://6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7" gracePeriod=30 Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.138837 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215614 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215669 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215697 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215738 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2467\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215876 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215953 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.215976 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.216005 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca\") pod \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\" (UID: \"53718cb5-d23a-45d2-9b8a-61fd2e8903e5\") " Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.216715 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca" 
(OuterVolumeSpecName: "trusted-ca") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.216845 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.221315 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467" (OuterVolumeSpecName: "kube-api-access-j2467") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "kube-api-access-j2467". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.221967 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.222114 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.222270 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.229613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.231920 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "53718cb5-d23a-45d2-9b8a-61fd2e8903e5" (UID: "53718cb5-d23a-45d2-9b8a-61fd2e8903e5"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318419 4810 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318499 4810 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318532 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318558 4810 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318583 4810 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318605 4810 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.318629 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2467\" (UniqueName: \"kubernetes.io/projected/53718cb5-d23a-45d2-9b8a-61fd2e8903e5-kube-api-access-j2467\") on node \"crc\" DevicePath \"\"" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.686087 4810 generic.go:334] "Generic (PLEG): container finished" podID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" containerID="6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7" exitCode=0 Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.686145 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.686157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" event={"ID":"53718cb5-d23a-45d2-9b8a-61fd2e8903e5","Type":"ContainerDied","Data":"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7"} Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.686207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-59nxd" event={"ID":"53718cb5-d23a-45d2-9b8a-61fd2e8903e5","Type":"ContainerDied","Data":"4f34f4ec36125c6523f90379a83a7ebe6a878479500b9b9586852048a528785b"} Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.686237 4810 scope.go:117] "RemoveContainer" containerID="6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.718276 4810 scope.go:117] "RemoveContainer" containerID="6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7" Oct 03 07:00:28 crc kubenswrapper[4810]: E1003 07:00:28.718827 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7\": container with ID starting with 6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7 not found: ID does not exist" containerID="6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.718884 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7"} err="failed to get container status \"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7\": rpc error: code = NotFound desc = could not find container \"6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7\": container with ID starting with 6cfa0c3a2a1bdede3f6737142c5daf886f2a7f3932d0797f52f88f2540796fe7 not found: ID does not exist" Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.727964 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 07:00:28 crc kubenswrapper[4810]: I1003 07:00:28.735948 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-59nxd"] Oct 03 07:00:29 crc kubenswrapper[4810]: I1003 07:00:29.313950 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" path="/var/lib/kubelet/pods/53718cb5-d23a-45d2-9b8a-61fd2e8903e5/volumes" Oct 03 07:01:32 crc kubenswrapper[4810]: I1003 07:01:32.090070 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:01:32 crc kubenswrapper[4810]: I1003 07:01:32.091060 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:02:02 crc 
kubenswrapper[4810]: I1003 07:02:02.089625 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:02:02 crc kubenswrapper[4810]: I1003 07:02:02.090455 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.090044 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.091526 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.091614 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.092732 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.092869 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5" gracePeriod=600 Oct 03 07:02:32 crc kubenswrapper[4810]: E1003 07:02:32.171525 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12d3cfb_2ba7_4eb6_b6b4_bfc4cec93930.slice/crio-conmon-6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.545443 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5" exitCode=0 Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.545534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5"} Oct 03 07:02:32 crc 
kubenswrapper[4810]: I1003 07:02:32.546204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b"} Oct 03 07:02:32 crc kubenswrapper[4810]: I1003 07:02:32.546317 4810 scope.go:117] "RemoveContainer" containerID="ed5f69f0806c72526f0ee0a4b4c4f26e79e4077b8ae8dc74bb9492e414fd0ae6" Oct 03 07:04:32 crc kubenswrapper[4810]: I1003 07:04:32.088872 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:04:32 crc kubenswrapper[4810]: I1003 07:04:32.089476 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:05:02 crc kubenswrapper[4810]: I1003 07:05:02.089572 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:05:02 crc kubenswrapper[4810]: I1003 07:05:02.091032 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.089143 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.090039 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.090124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.091265 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.091384 4810 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b" gracePeriod=600 Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.684548 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b" exitCode=0 Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.684618 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b"} Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.684861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3"} Oct 03 07:05:32 crc kubenswrapper[4810]: I1003 07:05:32.684908 4810 scope.go:117] "RemoveContainer" containerID="6cddd7a640992428663cedcc8602ac5cdfc362bfc36dd9482cd6feeef0114cc5" Oct 03 07:07:32 crc kubenswrapper[4810]: I1003 07:07:32.088769 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:07:32 crc kubenswrapper[4810]: I1003 07:07:32.089474 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.588037 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whnpx"] Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589687 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-controller" containerID="cri-o://34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589771 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="sbdb" containerID="cri-o://4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589838 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-node" containerID="cri-o://b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589940 4810 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-acl-logging" containerID="cri-o://2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589917 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="northd" containerID="cri-o://48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589887 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.589729 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="nbdb" containerID="cri-o://ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.642343 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" containerID="cri-o://6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" gracePeriod=30 Oct 03 07:08:00 crc kubenswrapper[4810]: I1003 07:08:00.996321 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/3.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.000296 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovn-acl-logging/0.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.001101 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovn-controller/0.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.001673 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.061990 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nhk8j"] Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.062456 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.062542 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.062596 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.062647 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.062736 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-node" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.062811 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-node" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.062864 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kubecfg-setup" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.062940 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kubecfg-setup" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.062989 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063045 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063097 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-acl-logging" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063151 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-acl-logging" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063207 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" containerName="registry" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063257 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" containerName="registry" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063308 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063358 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063414 4810 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063465 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063517 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="nbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063566 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="nbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063618 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="sbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063666 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="sbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.063718 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="northd" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063766 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="northd" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.063936 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="nbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064004 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064056 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064112 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064164 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="sbdb" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064216 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="northd" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064269 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovn-acl-logging" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064322 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064372 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064423 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="53718cb5-d23a-45d2-9b8a-61fd2e8903e5" containerName="registry" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064472 4810 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="kube-rbac-proxy-node" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.064600 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064651 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064789 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.064847 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.064996 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.065082 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerName="ovnkube-controller" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.066480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.129699 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130091 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130172 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130327 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.129866 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch" 
(OuterVolumeSpecName: "var-lib-openvswitch") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130170 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130303 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130585 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130663 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130737 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130598 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash" (OuterVolumeSpecName: "host-slash") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log" (OuterVolumeSpecName: "node-log") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.130769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131059 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131139 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131225 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131290 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131365 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4bdr\" (UniqueName: \"kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131096 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131178 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket" (OuterVolumeSpecName: "log-socket") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131370 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131388 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131607 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131700 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131776 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131840 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131923 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.131972 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132018 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). 
InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132084 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes\") pod \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\" (UID: \"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e\") " Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132347 4810 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132407 4810 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132467 4810 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132537 4810 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132603 4810 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132665 4810 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132724 4810 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132783 4810 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132839 4810 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132915 4810 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132979 4810 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.133038 4810 reconciler_common.go:293] "Volume 
detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.133098 4810 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132095 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132148 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132193 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.132431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.136989 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.137409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr" (OuterVolumeSpecName: "kube-api-access-b4bdr") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "kube-api-access-b4bdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.144263 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" (UID: "88c6d2ac-d97b-43a1-8bf7-3cc367fe108e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-systemd-units\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-env-overrides\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234246 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-bin\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234275 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234443 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-kubelet\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234567 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-log-socket\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234592 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-config\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234689 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-systemd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234715 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-script-lib\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234779 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-node-log\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234817 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovn-node-metrics-cert\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234847 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.234989 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-ovn\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235027 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wz5\" (UniqueName: \"kubernetes.io/projected/004c338c-7f47-437c-945d-4c4f5fe66bc5-kube-api-access-f9wz5\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235052 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235108 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-slash\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235158 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-etc-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235185 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-netns\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235204 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-netd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235226 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-var-lib-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235310 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235324 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4bdr\" (UniqueName: \"kubernetes.io/projected/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-kube-api-access-b4bdr\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235337 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235349 4810 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235360 4810 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235372 4810 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.235388 4810 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-ovn\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336199 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-f9wz5\" (UniqueName: \"kubernetes.io/projected/004c338c-7f47-437c-945d-4c4f5fe66bc5-kube-api-access-f9wz5\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336229 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336264 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-slash\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-etc-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336326 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-netns\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-netd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336370 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-slash\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336397 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-var-lib-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336443 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-var-lib-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336326 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-ovn\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336474 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-systemd-units\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336480 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-netns\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336534 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-etc-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336547 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-bin\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336512 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-bin\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336508 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-cni-netd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336606 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-systemd-units\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-openvswitch\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-env-overrides\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336761 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336807 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-kubelet\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336840 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336857 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-log-socket\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336939 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-kubelet\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336977 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-log-socket\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.336978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-config\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337094 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-systemd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337136 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-script-lib\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337186 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-node-log\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-run-systemd\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337236 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovn-node-metrics-cert\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337262 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-node-log\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337400 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/004c338c-7f47-437c-945d-4c4f5fe66bc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.337472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-env-overrides\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.338220 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-script-lib\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.339618 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovnkube-config\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.342287 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/004c338c-7f47-437c-945d-4c4f5fe66bc5-ovn-node-metrics-cert\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.366694 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wz5\" (UniqueName: \"kubernetes.io/projected/004c338c-7f47-437c-945d-4c4f5fe66bc5-kube-api-access-f9wz5\") pod \"ovnkube-node-nhk8j\" (UID: \"004c338c-7f47-437c-945d-4c4f5fe66bc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.382315 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.655139 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovnkube-controller/3.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.658282 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovn-acl-logging/0.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659006 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-whnpx_88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/ovn-controller/0.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659447 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659479 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659492 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659509 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659528 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659541 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659535 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659577 4810 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659613 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659631 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659641 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659651 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659556 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" exitCode=143 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659700 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659702 4810 generic.go:334] "Generic (PLEG): container finished" podID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" exitCode=143 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659677 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659909 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659939 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659946 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659953 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659959 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659966 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659972 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.659978 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660002 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660031 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660040 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660046 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660051 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660057 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660062 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660068 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660077 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660082 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660088 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660106 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660112 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660118 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660123 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660128 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660133 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660139 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660145 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660151 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660157 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660164 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-whnpx" event={"ID":"88c6d2ac-d97b-43a1-8bf7-3cc367fe108e","Type":"ContainerDied","Data":"6fe74cdfc6724198f6aa4e4428dc76ed20636bcf9b1e243a03d534e69eda887a"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660174 4810 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660183 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660188 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660194 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660200 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660205 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660211 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660216 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660220 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.660226 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.662009 4810 generic.go:334] "Generic (PLEG): container finished" podID="004c338c-7f47-437c-945d-4c4f5fe66bc5" containerID="645608b2e23098c5c2b7b253420b1dc1c20cb2c3857397dcc64bf808556a286c" exitCode=0 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.662059 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerDied","Data":"645608b2e23098c5c2b7b253420b1dc1c20cb2c3857397dcc64bf808556a286c"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.662076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"00b20f8238c3666846d447046b056d5512441f7c85ebcc79919d1af3b9c4175a"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.664325 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/2.log" Oct 03 07:08:01 crc 
kubenswrapper[4810]: I1003 07:08:01.664746 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/1.log" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.664790 4810 generic.go:334] "Generic (PLEG): container finished" podID="f9421086-70f1-441e-9aa0-5ac57a048c89" containerID="606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c" exitCode=2 Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.664816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerDied","Data":"606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.664837 4810 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065"} Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.665240 4810 scope.go:117] "RemoveContainer" containerID="606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.665532 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8pnks_openshift-multus(f9421086-70f1-441e-9aa0-5ac57a048c89)\"" pod="openshift-multus/multus-8pnks" podUID="f9421086-70f1-441e-9aa0-5ac57a048c89" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.697603 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.754435 4810 scope.go:117] "RemoveContainer" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.756714 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whnpx"] Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.760638 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-whnpx"] Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.783593 4810 scope.go:117] "RemoveContainer" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.796548 4810 scope.go:117] "RemoveContainer" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.811481 4810 scope.go:117] "RemoveContainer" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.824947 4810 scope.go:117] "RemoveContainer" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.842496 4810 scope.go:117] "RemoveContainer" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.863584 4810 scope.go:117] "RemoveContainer" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.894744 4810 scope.go:117] "RemoveContainer" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 
crc kubenswrapper[4810]: I1003 07:08:01.921087 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.921648 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.921760 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} err="failed to get container status \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.921824 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.922272 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": container with ID starting with b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc not found: ID does not exist" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.922317 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} err="failed to get container status \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": rpc error: code = NotFound desc = could not find container \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": container with ID starting with b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.922345 4810 scope.go:117] "RemoveContainer" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.922817 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": container with ID starting with 4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529 not found: ID does not exist" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.922876 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} err="failed to get container status \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": rpc error: code = NotFound desc = could not find container 
\"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": container with ID starting with 4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.922942 4810 scope.go:117] "RemoveContainer" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.923337 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": container with ID starting with ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4 not found: ID does not exist" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.923357 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} err="failed to get container status \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": rpc error: code = NotFound desc = could not find container \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": container with ID starting with ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.923371 4810 scope.go:117] "RemoveContainer" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.923692 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": container with ID starting with 48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9 not found: ID does not exist" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.923736 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} err="failed to get container status \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": rpc error: code = NotFound desc = could not find container \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": container with ID starting with 48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.923761 4810 scope.go:117] "RemoveContainer" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.924424 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": container with ID starting with 3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791 not found: ID does not exist" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.924460 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} 
err="failed to get container status \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": rpc error: code = NotFound desc = could not find container \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": container with ID starting with 3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.924479 4810 scope.go:117] "RemoveContainer" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.924866 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": container with ID starting with b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657 not found: ID does not exist" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.924953 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} err="failed to get container status \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": rpc error: code = NotFound desc = could not find container \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": container with ID starting with b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.924982 4810 scope.go:117] "RemoveContainer" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.925775 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": container with ID starting with 2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc not found: ID does not exist" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.925818 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} err="failed to get container status \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": rpc error: code = NotFound desc = could not find container \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": container with ID starting with 2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.925844 4810 scope.go:117] "RemoveContainer" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.926256 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": container with ID starting with 34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187 not found: ID does not exist" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.926294 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} err="failed to get container status \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": rpc error: code = NotFound desc = could not find container \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": container with ID starting with 34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.926318 4810 scope.go:117] "RemoveContainer" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 crc kubenswrapper[4810]: E1003 07:08:01.926699 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": container with ID starting with 2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57 not found: ID does not exist" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.926727 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} err="failed to get container status \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": rpc error: code = NotFound desc = could not find container \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": container with ID starting with 2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.926745 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.927128 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} err="failed to get container status \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.927171 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.927586 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} err="failed to get container status \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": rpc error: code = NotFound desc = could not find container \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": container with ID starting with b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.927673 4810 scope.go:117] "RemoveContainer" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.928492 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} err="failed to get container status \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": rpc error: code = NotFound desc = could not find container \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": container with ID starting with 4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.928524 4810 scope.go:117] "RemoveContainer" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.928875 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} err="failed to get container status \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": rpc error: code = NotFound desc = could not find container \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": container with ID starting with ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.928950 4810 scope.go:117] "RemoveContainer" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933066 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} err="failed to get container status \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": rpc error: code = NotFound desc = could not find container \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": container with ID starting with 48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933113 4810 scope.go:117] "RemoveContainer" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933458 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} err="failed to get container status \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": rpc error: code = NotFound desc = could not find container \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": container with ID starting with 3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933500 4810 scope.go:117] "RemoveContainer" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933857 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} err="failed to get container status \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": rpc error: code = NotFound desc = could not find container \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": container with ID starting with 
b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.933906 4810 scope.go:117] "RemoveContainer" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.934334 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} err="failed to get container status \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": rpc error: code = NotFound desc = could not find container \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": container with ID starting with 2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.934367 4810 scope.go:117] "RemoveContainer" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.934934 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} err="failed to get container status \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": rpc error: code = NotFound desc = could not find container \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": container with ID starting with 34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.934972 4810 scope.go:117] "RemoveContainer" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935226 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} err="failed to get container status \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": rpc error: code = NotFound desc = could not find container \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": container with ID starting with 2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935267 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935551 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} err="failed to get container status \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935582 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935877 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} err="failed to get container status \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": rpc error: code = NotFound desc = could not find container \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": container with ID starting with b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.935940 4810 scope.go:117] "RemoveContainer" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.936357 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} err="failed to get container status \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": rpc error: code = NotFound desc = could not find container \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": container with ID starting with 4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.936388 4810 scope.go:117] "RemoveContainer" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.936700 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} err="failed to get container status \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": rpc error: code = NotFound desc = could not find container \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": container with ID starting with ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.936731 4810 scope.go:117] "RemoveContainer" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.937079 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} err="failed to get container status \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": rpc error: code = NotFound desc = could not find container \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": container with ID starting with 48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.937112 4810 scope.go:117] "RemoveContainer" containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.937564 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} err="failed to get container status \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": rpc error: code = NotFound desc = could not find container \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": container with ID starting with 3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791 not found: ID does not exist" Oct 
03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.937608 4810 scope.go:117] "RemoveContainer" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.938070 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} err="failed to get container status \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": rpc error: code = NotFound desc = could not find container \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": container with ID starting with b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.938101 4810 scope.go:117] "RemoveContainer" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.938852 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} err="failed to get container status \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": rpc error: code = NotFound desc = could not find container \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": container with ID starting with 2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.938881 4810 scope.go:117] "RemoveContainer" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.939384 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} err="failed to get container status \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": rpc error: code = NotFound desc = could not find container \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": container with ID starting with 34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.939412 4810 scope.go:117] "RemoveContainer" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.939806 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} err="failed to get container status \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": rpc error: code = NotFound desc = could not find container \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": container with ID starting with 2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.939842 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.940189 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} err="failed to get container status 
\"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.940214 4810 scope.go:117] "RemoveContainer" containerID="b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.940696 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc"} err="failed to get container status \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": rpc error: code = NotFound desc = could not find container \"b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc\": container with ID starting with b4dcc98a88abf879b6d56e65b3ee3d361cf708a115d3aec38c2ceb72207c8afc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.940770 4810 scope.go:117] "RemoveContainer" containerID="4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.941260 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529"} err="failed to get container status \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": rpc error: code = NotFound desc = could not find container \"4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529\": container with ID starting with 4ec60f4a46d199a556ac4c1eedca56e2770728dde394670805be1e7cc055c529 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.941283 4810 scope.go:117] "RemoveContainer" containerID="ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.941606 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4"} err="failed to get container status \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": rpc error: code = NotFound desc = could not find container \"ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4\": container with ID starting with ee2a48e64a390c4ccfff617ed3732c442678ccb94539a7b5575318bca2c7d0c4 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.941672 4810 scope.go:117] "RemoveContainer" containerID="48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942077 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9"} err="failed to get container status \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": rpc error: code = NotFound desc = could not find container \"48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9\": container with ID starting with 48bd32521923f975e7518b9f4dd4afce8e8dfe8f0d8e518b90ceaa4b38a66cd9 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942102 4810 scope.go:117] "RemoveContainer" 
containerID="3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942476 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791"} err="failed to get container status \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": rpc error: code = NotFound desc = could not find container \"3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791\": container with ID starting with 3aeb4e135167c24838ed4d92837912b6d3c5ab375af0f112fd121278f3bbf791 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942515 4810 scope.go:117] "RemoveContainer" containerID="b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942785 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657"} err="failed to get container status \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": rpc error: code = NotFound desc = could not find container \"b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657\": container with ID starting with b584dd31815cff3430e79ecd329edb1526da4554413e6f8ddb033bcf599e8657 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.942832 4810 scope.go:117] "RemoveContainer" containerID="2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.943162 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc"} err="failed to get container status \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": rpc error: code = NotFound desc = could not find container \"2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc\": container with ID starting with 2c046ee9fc5e1b1cb933c61865e451189ac0a61e5cccb8af91568493083e9bfc not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.943192 4810 scope.go:117] "RemoveContainer" containerID="34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.943790 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187"} err="failed to get container status \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": rpc error: code = NotFound desc = could not find container \"34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187\": container with ID starting with 34a8af207966ff30ba5a5d4b0a298e9f2e0969cb13930b70861b28fc91480187 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.943824 4810 scope.go:117] "RemoveContainer" containerID="2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.944073 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57"} err="failed to get container status \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": rpc error: code = NotFound desc = could not find 
container \"2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57\": container with ID starting with 2954670955c0c1f7a79e7f015c3eaac947ff7c5109b44dafa057caf51e3e8e57 not found: ID does not exist" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.944092 4810 scope.go:117] "RemoveContainer" containerID="6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828" Oct 03 07:08:01 crc kubenswrapper[4810]: I1003 07:08:01.944273 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828"} err="failed to get container status \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": rpc error: code = NotFound desc = could not find container \"6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828\": container with ID starting with 6a49159cce7886a90093efb833eb5bcdd1ed47b18faed0e84df9661e4b5cc828 not found: ID does not exist" Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.088605 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.089024 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.675678 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"7ceea34c42eca00d295432da972ee39c85beb8929ef686a21fb43757d5e38f2e"} Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.676076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"13007047b201526afbe9b31abea4983e2717db2fd9f5523e476268e9138125af"} Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.676091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"686e88e859a92e0d1de64b9346a693d581e09014dd6fd4c5e773be444b4c2200"} Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.676101 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"fde8db2fecf239600ecdbbc3b13a9e5b011b8fe2445acb4052b2ecef1975c221"} Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.676112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"6320ea0b70119b4533fdee5082c72118221d82dd77b5ccd11121099ffc023ad6"} Oct 03 07:08:02 crc kubenswrapper[4810]: I1003 07:08:02.676121 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" 
event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"a3c2816e09771ef396bd511dc6bbec94dc60ad0e4f902dc0527d78034153bd6f"} Oct 03 07:08:03 crc kubenswrapper[4810]: I1003 07:08:03.310130 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c6d2ac-d97b-43a1-8bf7-3cc367fe108e" path="/var/lib/kubelet/pods/88c6d2ac-d97b-43a1-8bf7-3cc367fe108e/volumes" Oct 03 07:08:05 crc kubenswrapper[4810]: I1003 07:08:05.695209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"e961c44bc6a43b8480a56f69ad35875142cbdebd27bd64eeffab66fb24b7ee30"} Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.551624 4810 scope.go:117] "RemoveContainer" containerID="899de8766b69c31d60aa3a135f7050965aa51b8fbf5c085651972b3009624065" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.711046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" event={"ID":"004c338c-7f47-437c-945d-4c4f5fe66bc5","Type":"ContainerStarted","Data":"f8cbf2a0e732b8df1f5e089dddb05b686b7dc90aa240c7361981c08215ac9600"} Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.711480 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.711525 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.711537 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.713221 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/2.log" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.742422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.745243 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:07 crc kubenswrapper[4810]: I1003 07:08:07.746831 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" podStartSLOduration=6.746788916 podStartE2EDuration="6.746788916s" podCreationTimestamp="2025-10-03 07:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:08:07.742451598 +0000 UTC m=+721.169702333" watchObservedRunningTime="2025-10-03 07:08:07.746788916 +0000 UTC m=+721.174039651" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.864645 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-h875q"] Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.865872 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.867826 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.868158 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h875q"] Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.869132 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.869441 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.872109 4810 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-fk9ct" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.984879 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.985154 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:11 crc kubenswrapper[4810]: I1003 07:08:11.985279 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdhst\" (UniqueName: \"kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.086943 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.087034 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.087066 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdhst\" (UniqueName: \"kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.087560 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " 
pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.087840 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.108241 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdhst\" (UniqueName: \"kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst\") pod \"crc-storage-crc-h875q\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.189576 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.228251 4810 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(4709afa6193337915121eb826a2bb0aa1416219b450e8adc465fc156e50c6eee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.228380 4810 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(4709afa6193337915121eb826a2bb0aa1416219b450e8adc465fc156e50c6eee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.228411 4810 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(4709afa6193337915121eb826a2bb0aa1416219b450e8adc465fc156e50c6eee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.228524 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(4709afa6193337915121eb826a2bb0aa1416219b450e8adc465fc156e50c6eee): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-h875q" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.743628 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: I1003 07:08:12.744298 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.768636 4810 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(2e4bf4607d2af7bca13d1efdcec2578e6c59717a591c196ad00a80545e3d6cb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.768712 4810 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(2e4bf4607d2af7bca13d1efdcec2578e6c59717a591c196ad00a80545e3d6cb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.768748 4810 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(2e4bf4607d2af7bca13d1efdcec2578e6c59717a591c196ad00a80545e3d6cb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:12 crc kubenswrapper[4810]: E1003 07:08:12.768817 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(2e4bf4607d2af7bca13d1efdcec2578e6c59717a591c196ad00a80545e3d6cb7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-h875q" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" Oct 03 07:08:15 crc kubenswrapper[4810]: I1003 07:08:15.303038 4810 scope.go:117] "RemoveContainer" containerID="606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c" Oct 03 07:08:15 crc kubenswrapper[4810]: E1003 07:08:15.305089 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8pnks_openshift-multus(f9421086-70f1-441e-9aa0-5ac57a048c89)\"" pod="openshift-multus/multus-8pnks" podUID="f9421086-70f1-441e-9aa0-5ac57a048c89" Oct 03 07:08:26 crc kubenswrapper[4810]: I1003 07:08:26.302466 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:26 crc kubenswrapper[4810]: I1003 07:08:26.303862 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:26 crc kubenswrapper[4810]: E1003 07:08:26.347053 4810 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(fe1bd769650941f4a70126139cebce6ac15893e7a9d6d9337f99166188414087): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 07:08:26 crc kubenswrapper[4810]: E1003 07:08:26.347144 4810 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(fe1bd769650941f4a70126139cebce6ac15893e7a9d6d9337f99166188414087): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:26 crc kubenswrapper[4810]: E1003 07:08:26.347184 4810 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(fe1bd769650941f4a70126139cebce6ac15893e7a9d6d9337f99166188414087): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:26 crc kubenswrapper[4810]: E1003 07:08:26.347311 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-h875q_crc-storage(6897b018-d1be-4256-9e81-6f8836a0e303)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-h875q_crc-storage_6897b018-d1be-4256-9e81-6f8836a0e303_0(fe1bd769650941f4a70126139cebce6ac15893e7a9d6d9337f99166188414087): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-h875q" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" Oct 03 07:08:29 crc kubenswrapper[4810]: I1003 07:08:29.302410 4810 scope.go:117] "RemoveContainer" containerID="606d36543eb17d98b9f89c03aee5802c5d94aa282ffb1408a0714164f288128c" Oct 03 07:08:29 crc kubenswrapper[4810]: I1003 07:08:29.855862 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pnks_f9421086-70f1-441e-9aa0-5ac57a048c89/kube-multus/2.log" Oct 03 07:08:29 crc kubenswrapper[4810]: I1003 07:08:29.856241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pnks" event={"ID":"f9421086-70f1-441e-9aa0-5ac57a048c89","Type":"ContainerStarted","Data":"909a37ace0afe1da0b5bcfee12098a2304a9e663ee9d7fb0757e49df990153fa"} Oct 03 07:08:31 crc kubenswrapper[4810]: I1003 07:08:31.416758 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhk8j" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.089583 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.089675 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.089743 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.090625 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.090718 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3" gracePeriod=600 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.187540 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.187849 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerName="controller-manager" containerID="cri-o://e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9" gracePeriod=30 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.277365 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.278016 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerName="route-controller-manager" containerID="cri-o://359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7" gracePeriod=30 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.576794 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.643006 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673681 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7gcw\" (UniqueName: \"kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw\") pod \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673723 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles\") pod \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673746 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert\") pod \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673805 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn7zl\" (UniqueName: \"kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl\") pod \"5bed5787-1f8e-4259-9cb9-c6edb845df27\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673839 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config\") pod \"5bed5787-1f8e-4259-9cb9-c6edb845df27\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673855 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca\") pod \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673881 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert\") pod \"5bed5787-1f8e-4259-9cb9-c6edb845df27\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673930 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config\") pod \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\" (UID: \"1023edcf-7aaf-4340-96dd-af9d1fb114ea\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.673960 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca\") pod \"5bed5787-1f8e-4259-9cb9-c6edb845df27\" (UID: \"5bed5787-1f8e-4259-9cb9-c6edb845df27\") " Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674565 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "1023edcf-7aaf-4340-96dd-af9d1fb114ea" (UID: "1023edcf-7aaf-4340-96dd-af9d1fb114ea"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674652 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config" (OuterVolumeSpecName: "config") pod "5bed5787-1f8e-4259-9cb9-c6edb845df27" (UID: "5bed5787-1f8e-4259-9cb9-c6edb845df27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674571 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca" (OuterVolumeSpecName: "client-ca") pod "5bed5787-1f8e-4259-9cb9-c6edb845df27" (UID: "5bed5787-1f8e-4259-9cb9-c6edb845df27"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674665 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config" (OuterVolumeSpecName: "config") pod "1023edcf-7aaf-4340-96dd-af9d1fb114ea" (UID: "1023edcf-7aaf-4340-96dd-af9d1fb114ea"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674931 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674951 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674962 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.674970 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5bed5787-1f8e-4259-9cb9-c6edb845df27-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.675183 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1023edcf-7aaf-4340-96dd-af9d1fb114ea" (UID: "1023edcf-7aaf-4340-96dd-af9d1fb114ea"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.680112 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1023edcf-7aaf-4340-96dd-af9d1fb114ea" (UID: "1023edcf-7aaf-4340-96dd-af9d1fb114ea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.680301 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw" (OuterVolumeSpecName: "kube-api-access-j7gcw") pod "1023edcf-7aaf-4340-96dd-af9d1fb114ea" (UID: "1023edcf-7aaf-4340-96dd-af9d1fb114ea"). InnerVolumeSpecName "kube-api-access-j7gcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.680694 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl" (OuterVolumeSpecName: "kube-api-access-wn7zl") pod "5bed5787-1f8e-4259-9cb9-c6edb845df27" (UID: "5bed5787-1f8e-4259-9cb9-c6edb845df27"). InnerVolumeSpecName "kube-api-access-wn7zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.681196 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5bed5787-1f8e-4259-9cb9-c6edb845df27" (UID: "5bed5787-1f8e-4259-9cb9-c6edb845df27"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.776655 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7gcw\" (UniqueName: \"kubernetes.io/projected/1023edcf-7aaf-4340-96dd-af9d1fb114ea-kube-api-access-j7gcw\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.776696 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1023edcf-7aaf-4340-96dd-af9d1fb114ea-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.776708 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1023edcf-7aaf-4340-96dd-af9d1fb114ea-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.776720 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn7zl\" (UniqueName: \"kubernetes.io/projected/5bed5787-1f8e-4259-9cb9-c6edb845df27-kube-api-access-wn7zl\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.776732 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bed5787-1f8e-4259-9cb9-c6edb845df27-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.873049 4810 generic.go:334] "Generic (PLEG): container finished" podID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerID="e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9" exitCode=0 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.873087 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" event={"ID":"1023edcf-7aaf-4340-96dd-af9d1fb114ea","Type":"ContainerDied","Data":"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.873127 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" event={"ID":"1023edcf-7aaf-4340-96dd-af9d1fb114ea","Type":"ContainerDied","Data":"511213e4c58353bb7a259cf842261c32e5783db78ef8fa274a2b387baf950916"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.873145 4810 scope.go:117] "RemoveContainer" containerID="e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.873143 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-vjxst" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.876500 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3" exitCode=0 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.876563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.876606 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.878355 4810 generic.go:334] "Generic (PLEG): container finished" podID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerID="359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7" exitCode=0 Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.878393 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" event={"ID":"5bed5787-1f8e-4259-9cb9-c6edb845df27","Type":"ContainerDied","Data":"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.878424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" event={"ID":"5bed5787-1f8e-4259-9cb9-c6edb845df27","Type":"ContainerDied","Data":"def23873d41b215d542f59424bb31945caf6c1fbff12b6157e99b9ee3ce504a7"} Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.878398 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.887785 4810 scope.go:117] "RemoveContainer" containerID="e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9" Oct 03 07:08:32 crc kubenswrapper[4810]: E1003 07:08:32.888430 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9\": container with ID starting with e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9 not found: ID does not exist" containerID="e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.888471 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9"} err="failed to get container status \"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9\": rpc error: code = NotFound desc = could not find container \"e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9\": container with ID starting with e8eb15865cf1b9938f0bee19de2416db37ad741fea48db448e53eb9e4a45f2a9 not found: ID does not exist" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.888495 4810 scope.go:117] "RemoveContainer" containerID="2fc25fdac2d24f907caaff90273ae72cdeb27e4c9cc850b5c771befb0ece247b" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.912972 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.921058 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-84l77"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.926178 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.926657 4810 scope.go:117] "RemoveContainer" containerID="359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.929806 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-vjxst"] Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.938853 4810 scope.go:117] "RemoveContainer" containerID="359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7" Oct 03 07:08:32 crc kubenswrapper[4810]: E1003 07:08:32.939244 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7\": container with ID starting with 359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7 not found: ID does not exist" containerID="359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7" Oct 03 07:08:32 crc kubenswrapper[4810]: I1003 07:08:32.939282 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7"} err="failed to get container status \"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7\": rpc error: code = NotFound desc = could not find container 
\"359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7\": container with ID starting with 359d033232954a30edff8a92055096bc2f0fada7fe711606a1faca89d19ba0d7 not found: ID does not exist" Oct 03 07:08:33 crc kubenswrapper[4810]: I1003 07:08:33.318856 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" path="/var/lib/kubelet/pods/1023edcf-7aaf-4340-96dd-af9d1fb114ea/volumes" Oct 03 07:08:33 crc kubenswrapper[4810]: I1003 07:08:33.320053 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" path="/var/lib/kubelet/pods/5bed5787-1f8e-4259-9cb9-c6edb845df27/volumes" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.046999 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d598cfccb-47lzl"] Oct 03 07:08:34 crc kubenswrapper[4810]: E1003 07:08:34.047286 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerName="route-controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.047304 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerName="route-controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: E1003 07:08:34.047324 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerName="controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.047335 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerName="controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.047471 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1023edcf-7aaf-4340-96dd-af9d1fb114ea" containerName="controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.047495 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bed5787-1f8e-4259-9cb9-c6edb845df27" containerName="route-controller-manager" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.047968 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.056490 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj"] Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.056575 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.057357 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.058210 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.061554 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.062374 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.063287 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.063678 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.064508 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.067613 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.070759 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.072348 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.072382 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.074385 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.074512 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.076527 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d598cfccb-47lzl"] Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095094 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-config\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: 
\"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095248 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095279 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snst9\" (UniqueName: \"kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095332 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-client-ca\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095362 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095385 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6944bc79-13b3-420b-ac7e-8642afef1647-serving-cert\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095408 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmnr\" (UniqueName: \"kubernetes.io/projected/6944bc79-13b3-420b-ac7e-8642afef1647-kube-api-access-nsmnr\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.095571 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj"] Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.180014 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d598cfccb-47lzl"] Oct 03 07:08:34 crc kubenswrapper[4810]: E1003 07:08:34.180415 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-snst9 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" podUID="70715765-6777-43b0-8003-b8c047a60aed" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196439 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196504 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196530 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snst9\" (UniqueName: \"kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-client-ca\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196619 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6944bc79-13b3-420b-ac7e-8642afef1647-serving-cert\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196635 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmnr\" (UniqueName: \"kubernetes.io/projected/6944bc79-13b3-420b-ac7e-8642afef1647-kube-api-access-nsmnr\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.196674 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc 
kubenswrapper[4810]: I1003 07:08:34.196693 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-config\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.197839 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-client-ca\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.198049 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6944bc79-13b3-420b-ac7e-8642afef1647-config\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.198465 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.198639 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.199589 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.205232 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6944bc79-13b3-420b-ac7e-8642afef1647-serving-cert\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.205418 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.219150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmnr\" (UniqueName: 
\"kubernetes.io/projected/6944bc79-13b3-420b-ac7e-8642afef1647-kube-api-access-nsmnr\") pod \"route-controller-manager-648d56fc4f-9qltj\" (UID: \"6944bc79-13b3-420b-ac7e-8642afef1647\") " pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.219807 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snst9\" (UniqueName: \"kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9\") pod \"controller-manager-5d598cfccb-47lzl\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.406750 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.673506 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj"] Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.898039 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.899147 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" event={"ID":"6944bc79-13b3-420b-ac7e-8642afef1647","Type":"ContainerStarted","Data":"5b220d2fd65b8935972ec4e009301b44d33fab066827ba944f10941239508de4"} Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.899244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" event={"ID":"6944bc79-13b3-420b-ac7e-8642afef1647","Type":"ContainerStarted","Data":"77e8f97a872c132721857ad82da2d467798d47fb75f1817264f182ccb2416cb0"} Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.899279 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.911294 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:34 crc kubenswrapper[4810]: I1003 07:08:34.924204 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" podStartSLOduration=2.924182784 podStartE2EDuration="2.924182784s" podCreationTimestamp="2025-10-03 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:08:34.922146239 +0000 UTC m=+748.349396984" watchObservedRunningTime="2025-10-03 07:08:34.924182784 +0000 UTC m=+748.351433519" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008106 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert\") pod \"70715765-6777-43b0-8003-b8c047a60aed\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008201 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca\") pod \"70715765-6777-43b0-8003-b8c047a60aed\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snst9\" (UniqueName: \"kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9\") pod \"70715765-6777-43b0-8003-b8c047a60aed\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008259 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config\") pod \"70715765-6777-43b0-8003-b8c047a60aed\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008344 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles\") pod \"70715765-6777-43b0-8003-b8c047a60aed\" (UID: \"70715765-6777-43b0-8003-b8c047a60aed\") " Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008882 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca" (OuterVolumeSpecName: "client-ca") pod "70715765-6777-43b0-8003-b8c047a60aed" (UID: "70715765-6777-43b0-8003-b8c047a60aed"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.008964 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config" (OuterVolumeSpecName: "config") pod "70715765-6777-43b0-8003-b8c047a60aed" (UID: "70715765-6777-43b0-8003-b8c047a60aed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.009121 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "70715765-6777-43b0-8003-b8c047a60aed" (UID: "70715765-6777-43b0-8003-b8c047a60aed"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.017042 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "70715765-6777-43b0-8003-b8c047a60aed" (UID: "70715765-6777-43b0-8003-b8c047a60aed"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.017145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9" (OuterVolumeSpecName: "kube-api-access-snst9") pod "70715765-6777-43b0-8003-b8c047a60aed" (UID: "70715765-6777-43b0-8003-b8c047a60aed"). InnerVolumeSpecName "kube-api-access-snst9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.110133 4810 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70715765-6777-43b0-8003-b8c047a60aed-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.110181 4810 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.110202 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snst9\" (UniqueName: \"kubernetes.io/projected/70715765-6777-43b0-8003-b8c047a60aed-kube-api-access-snst9\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.110219 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.110237 4810 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70715765-6777-43b0-8003-b8c047a60aed-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.191386 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-648d56fc4f-9qltj" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.904194 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d598cfccb-47lzl" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.940194 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d598cfccb-47lzl"] Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.943650 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b48b84df4-8p6gs"] Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.944930 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.947485 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.947576 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.947711 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.947843 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d598cfccb-47lzl"] Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.948231 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.948237 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.949028 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.958346 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 07:08:35 crc kubenswrapper[4810]: I1003 07:08:35.967698 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b48b84df4-8p6gs"] Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.023093 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk65r\" (UniqueName: \"kubernetes.io/projected/5b932d11-7bd7-493e-bb3b-3e3935c48a84-kube-api-access-fk65r\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.023177 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-client-ca\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.023204 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-config\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: 
\"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.023474 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-proxy-ca-bundles\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.023701 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b932d11-7bd7-493e-bb3b-3e3935c48a84-serving-cert\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.124840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b932d11-7bd7-493e-bb3b-3e3935c48a84-serving-cert\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.125001 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk65r\" (UniqueName: \"kubernetes.io/projected/5b932d11-7bd7-493e-bb3b-3e3935c48a84-kube-api-access-fk65r\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.125072 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-client-ca\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.125112 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-config\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.125163 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-proxy-ca-bundles\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.126406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-client-ca\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.126639 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-config\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.127795 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b932d11-7bd7-493e-bb3b-3e3935c48a84-proxy-ca-bundles\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.137510 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b932d11-7bd7-493e-bb3b-3e3935c48a84-serving-cert\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.157514 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk65r\" (UniqueName: \"kubernetes.io/projected/5b932d11-7bd7-493e-bb3b-3e3935c48a84-kube-api-access-fk65r\") pod \"controller-manager-b48b84df4-8p6gs\" (UID: \"5b932d11-7bd7-493e-bb3b-3e3935c48a84\") " pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.265669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.507105 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b48b84df4-8p6gs"] Oct 03 07:08:36 crc kubenswrapper[4810]: W1003 07:08:36.515471 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b932d11_7bd7_493e_bb3b_3e3935c48a84.slice/crio-5be4ee301f456151bbad04f841838408961ffe3531fbf275933632fd90a6b627 WatchSource:0}: Error finding container 5be4ee301f456151bbad04f841838408961ffe3531fbf275933632fd90a6b627: Status 404 returned error can't find the container with id 5be4ee301f456151bbad04f841838408961ffe3531fbf275933632fd90a6b627 Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.912843 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" event={"ID":"5b932d11-7bd7-493e-bb3b-3e3935c48a84","Type":"ContainerStarted","Data":"286b869e705850ce1ea3595eac16e2d7dc3dd88985e314e83f81bf909d834d56"} Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.912911 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" event={"ID":"5b932d11-7bd7-493e-bb3b-3e3935c48a84","Type":"ContainerStarted","Data":"5be4ee301f456151bbad04f841838408961ffe3531fbf275933632fd90a6b627"} Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.913309 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.918700 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" Oct 03 07:08:36 crc kubenswrapper[4810]: I1003 07:08:36.945305 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b48b84df4-8p6gs" podStartSLOduration=2.945266014 podStartE2EDuration="2.945266014s" podCreationTimestamp="2025-10-03 07:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:08:36.944163035 +0000 UTC m=+750.371413780" watchObservedRunningTime="2025-10-03 07:08:36.945266014 +0000 UTC m=+750.372516759" Oct 03 07:08:37 crc kubenswrapper[4810]: I1003 07:08:37.310917 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70715765-6777-43b0-8003-b8c047a60aed" path="/var/lib/kubelet/pods/70715765-6777-43b0-8003-b8c047a60aed/volumes" Oct 03 07:08:38 crc kubenswrapper[4810]: I1003 07:08:38.301770 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:38 crc kubenswrapper[4810]: I1003 07:08:38.302337 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:38 crc kubenswrapper[4810]: I1003 07:08:38.736991 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-h875q"] Oct 03 07:08:38 crc kubenswrapper[4810]: W1003 07:08:38.747762 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6897b018_d1be_4256_9e81_6f8836a0e303.slice/crio-5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9 WatchSource:0}: Error finding container 5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9: Status 404 returned error can't find the container with id 5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9 Oct 03 07:08:38 crc kubenswrapper[4810]: I1003 07:08:38.751602 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:08:38 crc kubenswrapper[4810]: I1003 07:08:38.927172 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h875q" event={"ID":"6897b018-d1be-4256-9e81-6f8836a0e303","Type":"ContainerStarted","Data":"5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9"} Oct 03 07:08:39 crc kubenswrapper[4810]: I1003 07:08:39.934568 4810 generic.go:334] "Generic (PLEG): container finished" podID="6897b018-d1be-4256-9e81-6f8836a0e303" containerID="8f72f1b66e40a704b17eddf7c1b2fd9c946eedbb4ca304e3a7367a3d6c63fc82" exitCode=0 Oct 03 07:08:39 crc kubenswrapper[4810]: I1003 07:08:39.934687 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h875q" event={"ID":"6897b018-d1be-4256-9e81-6f8836a0e303","Type":"ContainerDied","Data":"8f72f1b66e40a704b17eddf7c1b2fd9c946eedbb4ca304e3a7367a3d6c63fc82"} Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.054084 4810 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.308226 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.398686 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage\") pod \"6897b018-d1be-4256-9e81-6f8836a0e303\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.398728 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdhst\" (UniqueName: \"kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst\") pod \"6897b018-d1be-4256-9e81-6f8836a0e303\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.398800 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt\") pod \"6897b018-d1be-4256-9e81-6f8836a0e303\" (UID: \"6897b018-d1be-4256-9e81-6f8836a0e303\") " Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.399090 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6897b018-d1be-4256-9e81-6f8836a0e303" (UID: "6897b018-d1be-4256-9e81-6f8836a0e303"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.403284 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst" (OuterVolumeSpecName: "kube-api-access-fdhst") pod "6897b018-d1be-4256-9e81-6f8836a0e303" (UID: "6897b018-d1be-4256-9e81-6f8836a0e303"). InnerVolumeSpecName "kube-api-access-fdhst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.411780 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6897b018-d1be-4256-9e81-6f8836a0e303" (UID: "6897b018-d1be-4256-9e81-6f8836a0e303"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.500627 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdhst\" (UniqueName: \"kubernetes.io/projected/6897b018-d1be-4256-9e81-6f8836a0e303-kube-api-access-fdhst\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.500671 4810 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6897b018-d1be-4256-9e81-6f8836a0e303-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.500684 4810 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6897b018-d1be-4256-9e81-6f8836a0e303-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.949700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-h875q" event={"ID":"6897b018-d1be-4256-9e81-6f8836a0e303","Type":"ContainerDied","Data":"5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9"} Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.949750 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e3b20566341e291fd1001dc73ce47ded38b978c9e195b1279f53b2524d943e9" Oct 03 07:08:41 crc kubenswrapper[4810]: I1003 07:08:41.949792 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-h875q" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.853696 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2"] Oct 03 07:08:49 crc kubenswrapper[4810]: E1003 07:08:49.855505 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" containerName="storage" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.855578 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" containerName="storage" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.855728 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" containerName="storage" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.856666 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.860535 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.866777 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2"] Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.923701 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.923767 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25hr\" (UniqueName: \"kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:49 crc kubenswrapper[4810]: I1003 07:08:49.923797 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.024974 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.025305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q25hr\" (UniqueName: \"kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.025468 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.025554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.025861 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.052030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25hr\" (UniqueName: \"kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.175555 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:50 crc kubenswrapper[4810]: I1003 07:08:50.649013 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2"] Oct 03 07:08:51 crc kubenswrapper[4810]: I1003 07:08:51.010697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerStarted","Data":"8830ef4e5998c5e51b537d2018daa0d971693aed6710508cdcee49d3a68b1ac5"} Oct 03 07:08:51 crc kubenswrapper[4810]: I1003 07:08:51.011159 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerStarted","Data":"3dc0e79116ef70800e18d06e7dd0f9379a3c1b768292fa145fa50257f11de028"} Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.022094 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerID="8830ef4e5998c5e51b537d2018daa0d971693aed6710508cdcee49d3a68b1ac5" exitCode=0 Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.022185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerDied","Data":"8830ef4e5998c5e51b537d2018daa0d971693aed6710508cdcee49d3a68b1ac5"} Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.202165 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.204362 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.230409 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.256276 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.256376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.256608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfjv\" (UniqueName: \"kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.358272 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.358354 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.358446 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfjv\" (UniqueName: \"kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.359013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.360001 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.383105 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nkfjv\" (UniqueName: \"kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv\") pod \"redhat-operators-zn4mj\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.530639 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:08:52 crc kubenswrapper[4810]: I1003 07:08:52.963888 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:08:53 crc kubenswrapper[4810]: W1003 07:08:53.026230 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519ff265_8d4c_4699_b915_d4b76723ac06.slice/crio-e59c7ba0c5ca61d148bdafb3cca54fcca7e9e96a7c37c0877b0f2c74cbd8c667 WatchSource:0}: Error finding container e59c7ba0c5ca61d148bdafb3cca54fcca7e9e96a7c37c0877b0f2c74cbd8c667: Status 404 returned error can't find the container with id e59c7ba0c5ca61d148bdafb3cca54fcca7e9e96a7c37c0877b0f2c74cbd8c667 Oct 03 07:08:54 crc kubenswrapper[4810]: I1003 07:08:54.039119 4810 generic.go:334] "Generic (PLEG): container finished" podID="519ff265-8d4c-4699-b915-d4b76723ac06" containerID="cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e" exitCode=0 Oct 03 07:08:54 crc kubenswrapper[4810]: I1003 07:08:54.039324 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerDied","Data":"cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e"} Oct 03 07:08:54 crc kubenswrapper[4810]: I1003 07:08:54.039388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerStarted","Data":"e59c7ba0c5ca61d148bdafb3cca54fcca7e9e96a7c37c0877b0f2c74cbd8c667"} Oct 03 07:08:54 crc kubenswrapper[4810]: I1003 07:08:54.046678 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerID="2253a97372485ce7ba9dbcd2ff5187e32c86da39dc1e1d15c8dbd2874943f274" exitCode=0 Oct 03 07:08:54 crc kubenswrapper[4810]: I1003 07:08:54.046957 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerDied","Data":"2253a97372485ce7ba9dbcd2ff5187e32c86da39dc1e1d15c8dbd2874943f274"} Oct 03 07:08:55 crc kubenswrapper[4810]: I1003 07:08:55.056652 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerID="52c64000dcaa0a52b80258cb553aa8793b9044c54d3052019ebc2efd26dbdfd2" exitCode=0 Oct 03 07:08:55 crc kubenswrapper[4810]: I1003 07:08:55.056826 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerDied","Data":"52c64000dcaa0a52b80258cb553aa8793b9044c54d3052019ebc2efd26dbdfd2"} Oct 03 07:08:55 crc kubenswrapper[4810]: I1003 07:08:55.060568 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" 
event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerStarted","Data":"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac"} Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.073885 4810 generic.go:334] "Generic (PLEG): container finished" podID="519ff265-8d4c-4699-b915-d4b76723ac06" containerID="cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac" exitCode=0 Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.073982 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerDied","Data":"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac"} Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.526264 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.625593 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q25hr\" (UniqueName: \"kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr\") pod \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.625653 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util\") pod \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.625696 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle\") pod \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\" (UID: \"dbf4ec42-1c90-4cc4-a6da-33d98660eea3\") " Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.626647 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle" (OuterVolumeSpecName: "bundle") pod "dbf4ec42-1c90-4cc4-a6da-33d98660eea3" (UID: "dbf4ec42-1c90-4cc4-a6da-33d98660eea3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.632405 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr" (OuterVolumeSpecName: "kube-api-access-q25hr") pod "dbf4ec42-1c90-4cc4-a6da-33d98660eea3" (UID: "dbf4ec42-1c90-4cc4-a6da-33d98660eea3"). InnerVolumeSpecName "kube-api-access-q25hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.726841 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q25hr\" (UniqueName: \"kubernetes.io/projected/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-kube-api-access-q25hr\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.726880 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.807821 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util" (OuterVolumeSpecName: "util") pod "dbf4ec42-1c90-4cc4-a6da-33d98660eea3" (UID: "dbf4ec42-1c90-4cc4-a6da-33d98660eea3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:08:56 crc kubenswrapper[4810]: I1003 07:08:56.828141 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dbf4ec42-1c90-4cc4-a6da-33d98660eea3-util\") on node \"crc\" DevicePath \"\"" Oct 03 07:08:57 crc kubenswrapper[4810]: I1003 07:08:57.084123 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerStarted","Data":"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133"} Oct 03 07:08:57 crc kubenswrapper[4810]: I1003 07:08:57.089247 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" event={"ID":"dbf4ec42-1c90-4cc4-a6da-33d98660eea3","Type":"ContainerDied","Data":"3dc0e79116ef70800e18d06e7dd0f9379a3c1b768292fa145fa50257f11de028"} Oct 03 07:08:57 crc kubenswrapper[4810]: I1003 07:08:57.089303 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dc0e79116ef70800e18d06e7dd0f9379a3c1b768292fa145fa50257f11de028" Oct 03 07:08:57 crc kubenswrapper[4810]: I1003 07:08:57.089351 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2" Oct 03 07:08:57 crc kubenswrapper[4810]: I1003 07:08:57.116386 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zn4mj" podStartSLOduration=2.627355738 podStartE2EDuration="5.116360851s" podCreationTimestamp="2025-10-03 07:08:52 +0000 UTC" firstStartedPulling="2025-10-03 07:08:54.043955134 +0000 UTC m=+767.471205909" lastFinishedPulling="2025-10-03 07:08:56.532960277 +0000 UTC m=+769.960211022" observedRunningTime="2025-10-03 07:08:57.112394463 +0000 UTC m=+770.539645228" watchObservedRunningTime="2025-10-03 07:08:57.116360851 +0000 UTC m=+770.543611596" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.340802 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk"] Oct 03 07:09:00 crc kubenswrapper[4810]: E1003 07:09:00.341501 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="pull" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.341521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="pull" Oct 03 07:09:00 crc kubenswrapper[4810]: E1003 07:09:00.341547 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="util" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.341574 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="util" Oct 03 07:09:00 crc kubenswrapper[4810]: E1003 07:09:00.341594 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="extract" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.341604 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="extract" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.341740 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf4ec42-1c90-4cc4-a6da-33d98660eea3" containerName="extract" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.342271 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.346483 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.346758 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tdkbs" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.349606 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.352471 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk"] Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.490678 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khg55\" (UniqueName: \"kubernetes.io/projected/1a6bb43f-fea9-4fd4-86c6-50fe6845ec24-kube-api-access-khg55\") pod \"nmstate-operator-858ddd8f98-pcfqk\" (UID: \"1a6bb43f-fea9-4fd4-86c6-50fe6845ec24\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.592453 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khg55\" (UniqueName: \"kubernetes.io/projected/1a6bb43f-fea9-4fd4-86c6-50fe6845ec24-kube-api-access-khg55\") pod \"nmstate-operator-858ddd8f98-pcfqk\" (UID: \"1a6bb43f-fea9-4fd4-86c6-50fe6845ec24\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.615295 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khg55\" (UniqueName: \"kubernetes.io/projected/1a6bb43f-fea9-4fd4-86c6-50fe6845ec24-kube-api-access-khg55\") pod \"nmstate-operator-858ddd8f98-pcfqk\" (UID: \"1a6bb43f-fea9-4fd4-86c6-50fe6845ec24\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" Oct 03 07:09:00 crc kubenswrapper[4810]: I1003 07:09:00.667679 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" Oct 03 07:09:01 crc kubenswrapper[4810]: I1003 07:09:01.104822 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk"] Oct 03 07:09:01 crc kubenswrapper[4810]: W1003 07:09:01.115048 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a6bb43f_fea9_4fd4_86c6_50fe6845ec24.slice/crio-a41daedacbc13943eaef11e7bd93e41ae8d86ff7570b29047a45d694bd9bf4f6 WatchSource:0}: Error finding container a41daedacbc13943eaef11e7bd93e41ae8d86ff7570b29047a45d694bd9bf4f6: Status 404 returned error can't find the container with id a41daedacbc13943eaef11e7bd93e41ae8d86ff7570b29047a45d694bd9bf4f6 Oct 03 07:09:02 crc kubenswrapper[4810]: I1003 07:09:02.139319 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" event={"ID":"1a6bb43f-fea9-4fd4-86c6-50fe6845ec24","Type":"ContainerStarted","Data":"a41daedacbc13943eaef11e7bd93e41ae8d86ff7570b29047a45d694bd9bf4f6"} Oct 03 07:09:02 crc kubenswrapper[4810]: I1003 07:09:02.545586 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:02 crc kubenswrapper[4810]: I1003 07:09:02.545633 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:02 crc kubenswrapper[4810]: I1003 07:09:02.601540 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:03 crc kubenswrapper[4810]: I1003 07:09:03.191501 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:04 crc kubenswrapper[4810]: I1003 07:09:04.159879 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" event={"ID":"1a6bb43f-fea9-4fd4-86c6-50fe6845ec24","Type":"ContainerStarted","Data":"8e129d014f17a53415dbac1070d87aca4bbfd8ede60c95f4d56abb444ea22e30"} Oct 03 07:09:04 crc kubenswrapper[4810]: I1003 07:09:04.188595 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pcfqk" podStartSLOduration=1.92603316 podStartE2EDuration="4.188561944s" podCreationTimestamp="2025-10-03 07:09:00 +0000 UTC" firstStartedPulling="2025-10-03 07:09:01.125182421 +0000 UTC m=+774.552433156" lastFinishedPulling="2025-10-03 07:09:03.387711205 +0000 UTC m=+776.814961940" observedRunningTime="2025-10-03 07:09:04.187213188 +0000 UTC m=+777.614463943" watchObservedRunningTime="2025-10-03 07:09:04.188561944 +0000 UTC m=+777.615812719" Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.192579 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.192947 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zn4mj" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="registry-server" containerID="cri-o://5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133" gracePeriod=2 Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.723082 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.860105 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfjv\" (UniqueName: \"kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv\") pod \"519ff265-8d4c-4699-b915-d4b76723ac06\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.860176 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content\") pod \"519ff265-8d4c-4699-b915-d4b76723ac06\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.860239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities\") pod \"519ff265-8d4c-4699-b915-d4b76723ac06\" (UID: \"519ff265-8d4c-4699-b915-d4b76723ac06\") " Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.861719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities" (OuterVolumeSpecName: "utilities") pod "519ff265-8d4c-4699-b915-d4b76723ac06" (UID: "519ff265-8d4c-4699-b915-d4b76723ac06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.868692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv" (OuterVolumeSpecName: "kube-api-access-nkfjv") pod "519ff265-8d4c-4699-b915-d4b76723ac06" (UID: "519ff265-8d4c-4699-b915-d4b76723ac06"). InnerVolumeSpecName "kube-api-access-nkfjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.962078 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfjv\" (UniqueName: \"kubernetes.io/projected/519ff265-8d4c-4699-b915-d4b76723ac06-kube-api-access-nkfjv\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:05 crc kubenswrapper[4810]: I1003 07:09:05.962146 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.172138 4810 generic.go:334] "Generic (PLEG): container finished" podID="519ff265-8d4c-4699-b915-d4b76723ac06" containerID="5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133" exitCode=0 Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.172180 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerDied","Data":"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133"} Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.172225 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zn4mj" event={"ID":"519ff265-8d4c-4699-b915-d4b76723ac06","Type":"ContainerDied","Data":"e59c7ba0c5ca61d148bdafb3cca54fcca7e9e96a7c37c0877b0f2c74cbd8c667"} Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.172244 4810 scope.go:117] "RemoveContainer" containerID="5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.173120 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zn4mj" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.191705 4810 scope.go:117] "RemoveContainer" containerID="cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.213725 4810 scope.go:117] "RemoveContainer" containerID="cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.234816 4810 scope.go:117] "RemoveContainer" containerID="5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133" Oct 03 07:09:06 crc kubenswrapper[4810]: E1003 07:09:06.235441 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133\": container with ID starting with 5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133 not found: ID does not exist" containerID="5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.235550 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133"} err="failed to get container status \"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133\": rpc error: code = NotFound desc = could not find container \"5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133\": container with ID starting with 5f812cc6fcc2fff68b01460c82d658b7f81e1055b2f82d00392ed3e7d358a133 not found: ID does not exist" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.235623 4810 scope.go:117] "RemoveContainer" containerID="cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac" Oct 03 07:09:06 crc kubenswrapper[4810]: E1003 07:09:06.236464 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac\": container with ID starting with cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac not found: ID does not exist" containerID="cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.236531 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac"} err="failed to get container status \"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac\": rpc error: code = NotFound desc = could not find container \"cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac\": container with ID starting with cd8ca73b9c73f5d29d88c2dd13579818f588ed102c8dad45984d6ac47888f8ac not found: ID does not exist" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.236590 4810 scope.go:117] "RemoveContainer" containerID="cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e" Oct 03 07:09:06 crc kubenswrapper[4810]: E1003 07:09:06.237227 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e\": container with ID starting with cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e not found: ID does not exist" containerID="cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e" 
Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.237272 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e"} err="failed to get container status \"cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e\": rpc error: code = NotFound desc = could not find container \"cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e\": container with ID starting with cfcd1a5e40b95a1fa5eb627eec46ea8a341d5f638f1f96ced4282cac9013975e not found: ID does not exist" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.725088 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "519ff265-8d4c-4699-b915-d4b76723ac06" (UID: "519ff265-8d4c-4699-b915-d4b76723ac06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.773608 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/519ff265-8d4c-4699-b915-d4b76723ac06-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.822658 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:09:06 crc kubenswrapper[4810]: I1003 07:09:06.826143 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zn4mj"] Oct 03 07:09:07 crc kubenswrapper[4810]: I1003 07:09:07.315640 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" path="/var/lib/kubelet/pods/519ff265-8d4c-4699-b915-d4b76723ac06/volumes" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.602347 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:09 crc kubenswrapper[4810]: E1003 07:09:09.602991 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="extract-utilities" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.603006 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="extract-utilities" Oct 03 07:09:09 crc kubenswrapper[4810]: E1003 07:09:09.603019 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="extract-content" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.603030 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="extract-content" Oct 03 07:09:09 crc kubenswrapper[4810]: E1003 07:09:09.603045 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="registry-server" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.603054 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="registry-server" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.603183 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="519ff265-8d4c-4699-b915-d4b76723ac06" containerName="registry-server" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.604055 4810 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.622568 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.717533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.717705 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctvr6\" (UniqueName: \"kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.717750 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.818666 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctvr6\" (UniqueName: \"kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.818731 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.818773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.819291 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.819331 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.843600 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctvr6\" (UniqueName: \"kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6\") pod \"redhat-marketplace-sfxxp\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:09 crc kubenswrapper[4810]: I1003 07:09:09.943157 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.295450 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.297055 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.301146 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-drd4d" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.313234 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.316707 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.317348 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.319113 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.349792 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.352821 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8ztlc"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.353607 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.427237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4x65\" (UniqueName: \"kubernetes.io/projected/98693660-7374-45d1-bc6c-677bb3532d3c-kube-api-access-k4x65\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428083 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-dbus-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-nmstate-lock\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428523 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bf55\" (UniqueName: \"kubernetes.io/projected/27a427fb-1f71-4935-89e7-764400c772c9-kube-api-access-6bf55\") pod \"nmstate-metrics-fdff9cb8d-fc6mm\" (UID: \"27a427fb-1f71-4935-89e7-764400c772c9\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428591 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/332c430e-f6e1-4068-8b12-d0a8bc3e1183-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428622 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-ovs-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.428752 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c7pg\" (UniqueName: \"kubernetes.io/projected/332c430e-f6e1-4068-8b12-d0a8bc3e1183-kube-api-access-5c7pg\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.450711 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.451518 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.453846 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.454233 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-l7cv5" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.454437 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.461219 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.471529 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:10 crc kubenswrapper[4810]: W1003 07:09:10.481710 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585b8930_ba1b_43ef_a57e_458a63c2ca8c.slice/crio-d115bd1089b5468fc9364156aef199d1bbb86f407690db7c55830a3009d2f016 WatchSource:0}: Error finding container d115bd1089b5468fc9364156aef199d1bbb86f407690db7c55830a3009d2f016: Status 404 returned error can't find the container with id d115bd1089b5468fc9364156aef199d1bbb86f407690db7c55830a3009d2f016 Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530100 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c7pg\" (UniqueName: \"kubernetes.io/projected/332c430e-f6e1-4068-8b12-d0a8bc3e1183-kube-api-access-5c7pg\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530162 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4x65\" (UniqueName: \"kubernetes.io/projected/98693660-7374-45d1-bc6c-677bb3532d3c-kube-api-access-k4x65\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530199 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-dbus-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530229 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-nmstate-lock\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530269 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqqs\" (UniqueName: \"kubernetes.io/projected/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-kube-api-access-hdqqs\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc 
kubenswrapper[4810]: I1003 07:09:10.530305 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-nmstate-lock\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530376 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bf55\" (UniqueName: \"kubernetes.io/projected/27a427fb-1f71-4935-89e7-764400c772c9-kube-api-access-6bf55\") pod \"nmstate-metrics-fdff9cb8d-fc6mm\" (UID: \"27a427fb-1f71-4935-89e7-764400c772c9\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530444 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530484 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/332c430e-f6e1-4068-8b12-d0a8bc3e1183-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530516 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-ovs-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530599 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-ovs-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.530609 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/98693660-7374-45d1-bc6c-677bb3532d3c-dbus-socket\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.538422 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/332c430e-f6e1-4068-8b12-d0a8bc3e1183-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 
07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.548791 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c7pg\" (UniqueName: \"kubernetes.io/projected/332c430e-f6e1-4068-8b12-d0a8bc3e1183-kube-api-access-5c7pg\") pod \"nmstate-webhook-6cdbc54649-9qx25\" (UID: \"332c430e-f6e1-4068-8b12-d0a8bc3e1183\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.552569 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4x65\" (UniqueName: \"kubernetes.io/projected/98693660-7374-45d1-bc6c-677bb3532d3c-kube-api-access-k4x65\") pod \"nmstate-handler-8ztlc\" (UID: \"98693660-7374-45d1-bc6c-677bb3532d3c\") " pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.552963 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bf55\" (UniqueName: \"kubernetes.io/projected/27a427fb-1f71-4935-89e7-764400c772c9-kube-api-access-6bf55\") pod \"nmstate-metrics-fdff9cb8d-fc6mm\" (UID: \"27a427fb-1f71-4935-89e7-764400c772c9\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.615715 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.632193 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqqs\" (UniqueName: \"kubernetes.io/projected/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-kube-api-access-hdqqs\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.632275 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.632324 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: E1003 07:09:10.632739 4810 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 03 07:09:10 crc kubenswrapper[4810]: E1003 07:09:10.632877 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert podName:8d16767b-6c32-442f-b6b6-6c6cc78e2c25 nodeName:}" failed. No retries permitted until 2025-10-03 07:09:11.132858492 +0000 UTC m=+784.560109227 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-sjq8r" (UID: "8d16767b-6c32-442f-b6b6-6c6cc78e2c25") : secret "plugin-serving-cert" not found Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.632981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.633921 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.648779 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bc65d5d98-4p6lt"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.649669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.653375 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqqs\" (UniqueName: \"kubernetes.io/projected/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-kube-api-access-hdqqs\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.673758 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc65d5d98-4p6lt"] Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.681152 4810 util.go:30] "No sandbox for pod can be found. 
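The MountVolume.SetUp failure above is the kubelet waiting on a Secret that does not exist yet: the first attempt to mount openshift-nmstate/plugin-serving-cert fails with "secret not found" and is re-queued with durationBeforeRetry 500ms, and the retried mount succeeds about half a second later once the secret is present (visible in the 07:09:11.14 entries further down). The sketch below only illustrates that per-volume retry pattern under the assumption of a doubling delay; the function names and the backoff factor are illustrative, not kubelet's actual nestedpendingoperations code.

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // setUpWithBackoff retries a failed volume SetUp after an increasing delay,
    // mirroring the "no retries permitted until ... (durationBeforeRetry 500ms)"
    // behaviour seen in the log. Doubling and the cap are assumptions.
    func setUpWithBackoff(setUp func() error, initial, max time.Duration) error {
        delay := initial
        for attempt := 1; ; attempt++ {
            err := setUp()
            if err == nil {
                return nil
            }
            if delay > max {
                return fmt.Errorf("giving up after %d attempts: %w", attempt, err)
            }
            fmt.Printf("attempt %d failed (%v); next retry in %v\n", attempt, err, delay)
            time.Sleep(delay)
            delay *= 2
        }
    }

    func main() {
        attempts := 0
        err := setUpWithBackoff(func() error {
            attempts++
            if attempts == 1 { // the Secret does not exist yet on the first try
                return errors.New(`secret "plugin-serving-cert" not found`)
            }
            return nil // the Secret has appeared; the retried mount succeeds
        }, 500*time.Millisecond, 2*time.Minute)
        fmt.Println("mount result:", err) // mount result: <nil>
    }
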
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735464 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735516 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-oauth-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-service-ca\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735576 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwbww\" (UniqueName: \"kubernetes.io/projected/7736fe75-c536-4ba3-a03c-d2df6d90f32b-kube-api-access-jwbww\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735594 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735612 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-trusted-ca-bundle\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.735631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-oauth-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838008 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-oauth-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838486 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-service-ca\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwbww\" (UniqueName: \"kubernetes.io/projected/7736fe75-c536-4ba3-a03c-d2df6d90f32b-kube-api-access-jwbww\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-trusted-ca-bundle\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838591 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-oauth-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.838648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.839686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-oauth-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.839754 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-service-ca\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.840166 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.840526 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7736fe75-c536-4ba3-a03c-d2df6d90f32b-trusted-ca-bundle\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.845829 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-serving-cert\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.846638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7736fe75-c536-4ba3-a03c-d2df6d90f32b-console-oauth-config\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.864608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwbww\" (UniqueName: \"kubernetes.io/projected/7736fe75-c536-4ba3-a03c-d2df6d90f32b-kube-api-access-jwbww\") pod \"console-bc65d5d98-4p6lt\" (UID: \"7736fe75-c536-4ba3-a03c-d2df6d90f32b\") " pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:10 crc kubenswrapper[4810]: I1003 07:09:10.975258 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.137018 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm"] Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.142886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.150403 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8d16767b-6c32-442f-b6b6-6c6cc78e2c25-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-sjq8r\" (UID: \"8d16767b-6c32-442f-b6b6-6c6cc78e2c25\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.212793 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25"] Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.214821 4810 generic.go:334] "Generic (PLEG): container finished" podID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerID="537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9" exitCode=0 Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.214943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerDied","Data":"537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9"} Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.215034 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" 
event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerStarted","Data":"d115bd1089b5468fc9364156aef199d1bbb86f407690db7c55830a3009d2f016"} Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.216797 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" event={"ID":"27a427fb-1f71-4935-89e7-764400c772c9","Type":"ContainerStarted","Data":"bcdf9a58c83f2a72af96aeaff5a68c36f9fb61acd9661b466c710fe558a5b68d"} Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.217821 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ztlc" event={"ID":"98693660-7374-45d1-bc6c-677bb3532d3c","Type":"ContainerStarted","Data":"1ccdfffd36ad762ce9ed014417b7d3eabc192e914346ebd03de7c358d2f741f6"} Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.368517 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.412054 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bc65d5d98-4p6lt"] Oct 03 07:09:11 crc kubenswrapper[4810]: W1003 07:09:11.417449 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7736fe75_c536_4ba3_a03c_d2df6d90f32b.slice/crio-69068a843fb2a270027a6308a52b25c61a65dbd52f6970b30a229af36037d6ae WatchSource:0}: Error finding container 69068a843fb2a270027a6308a52b25c61a65dbd52f6970b30a229af36037d6ae: Status 404 returned error can't find the container with id 69068a843fb2a270027a6308a52b25c61a65dbd52f6970b30a229af36037d6ae Oct 03 07:09:11 crc kubenswrapper[4810]: I1003 07:09:11.804752 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r"] Oct 03 07:09:11 crc kubenswrapper[4810]: W1003 07:09:11.811445 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d16767b_6c32_442f_b6b6_6c6cc78e2c25.slice/crio-add4d9376df94482f22b8e0f78b6ec141f41fe065d4269f8ed5c7254ce4070c0 WatchSource:0}: Error finding container add4d9376df94482f22b8e0f78b6ec141f41fe065d4269f8ed5c7254ce4070c0: Status 404 returned error can't find the container with id add4d9376df94482f22b8e0f78b6ec141f41fe065d4269f8ed5c7254ce4070c0 Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.223682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" event={"ID":"332c430e-f6e1-4068-8b12-d0a8bc3e1183","Type":"ContainerStarted","Data":"05ad5e97eaa0a93148049028d3fd1cbde4a66d82ea25cac2d8b8dbb4d9a659b5"} Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.227746 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc65d5d98-4p6lt" event={"ID":"7736fe75-c536-4ba3-a03c-d2df6d90f32b","Type":"ContainerStarted","Data":"20bc51df8697a8c5492dd194eaa8727c6b59457dbaca2f3aed434af32cbd4a2e"} Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.227791 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bc65d5d98-4p6lt" event={"ID":"7736fe75-c536-4ba3-a03c-d2df6d90f32b","Type":"ContainerStarted","Data":"69068a843fb2a270027a6308a52b25c61a65dbd52f6970b30a229af36037d6ae"} Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.230853 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" event={"ID":"8d16767b-6c32-442f-b6b6-6c6cc78e2c25","Type":"ContainerStarted","Data":"add4d9376df94482f22b8e0f78b6ec141f41fe065d4269f8ed5c7254ce4070c0"} Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.235594 4810 generic.go:334] "Generic (PLEG): container finished" podID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerID="d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115" exitCode=0 Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.235638 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerDied","Data":"d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115"} Oct 03 07:09:12 crc kubenswrapper[4810]: I1003 07:09:12.243533 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bc65d5d98-4p6lt" podStartSLOduration=2.243511242 podStartE2EDuration="2.243511242s" podCreationTimestamp="2025-10-03 07:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:09:12.240883581 +0000 UTC m=+785.668134336" watchObservedRunningTime="2025-10-03 07:09:12.243511242 +0000 UTC m=+785.670761977" Oct 03 07:09:13 crc kubenswrapper[4810]: I1003 07:09:13.245845 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerStarted","Data":"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33"} Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.255746 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" event={"ID":"332c430e-f6e1-4068-8b12-d0a8bc3e1183","Type":"ContainerStarted","Data":"bdf7e158ab4e067f7920248ec013b16448dfc2d4eb17c081ebcbd4cf0c7aab5a"} Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.256250 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.258998 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" event={"ID":"27a427fb-1f71-4935-89e7-764400c772c9","Type":"ContainerStarted","Data":"893ff85e4e13898d3bee761b02828541f110f49ef364f4e850516d5aefb0ab07"} Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.261664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8ztlc" event={"ID":"98693660-7374-45d1-bc6c-677bb3532d3c","Type":"ContainerStarted","Data":"401af7e549fa0fbee8e15aebc53cd5a4d3eec0cc6a0525f4c719f30c9b0b9db4"} Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.261839 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.278207 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfxxp" podStartSLOduration=3.498927504 podStartE2EDuration="5.278159389s" podCreationTimestamp="2025-10-03 07:09:09 +0000 UTC" firstStartedPulling="2025-10-03 07:09:11.220739626 +0000 UTC m=+784.647990361" lastFinishedPulling="2025-10-03 07:09:12.999971511 +0000 UTC m=+786.427222246" observedRunningTime="2025-10-03 07:09:13.265777804 +0000 UTC 
m=+786.693028549" watchObservedRunningTime="2025-10-03 07:09:14.278159389 +0000 UTC m=+787.705410144" Oct 03 07:09:14 crc kubenswrapper[4810]: I1003 07:09:14.278387 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" podStartSLOduration=2.356270598 podStartE2EDuration="4.278381944s" podCreationTimestamp="2025-10-03 07:09:10 +0000 UTC" firstStartedPulling="2025-10-03 07:09:11.232564116 +0000 UTC m=+784.659814851" lastFinishedPulling="2025-10-03 07:09:13.154675462 +0000 UTC m=+786.581926197" observedRunningTime="2025-10-03 07:09:14.277138461 +0000 UTC m=+787.704389246" watchObservedRunningTime="2025-10-03 07:09:14.278381944 +0000 UTC m=+787.705632689" Oct 03 07:09:15 crc kubenswrapper[4810]: I1003 07:09:15.270333 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" event={"ID":"8d16767b-6c32-442f-b6b6-6c6cc78e2c25","Type":"ContainerStarted","Data":"3eaf38feda99befb4a03168ed2cf9f4383f5757ef5ac22f129ade8d0c2b3fac7"} Oct 03 07:09:15 crc kubenswrapper[4810]: I1003 07:09:15.286982 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-sjq8r" podStartSLOduration=2.921546282 podStartE2EDuration="5.286961867s" podCreationTimestamp="2025-10-03 07:09:10 +0000 UTC" firstStartedPulling="2025-10-03 07:09:11.81416713 +0000 UTC m=+785.241417865" lastFinishedPulling="2025-10-03 07:09:14.179582715 +0000 UTC m=+787.606833450" observedRunningTime="2025-10-03 07:09:15.283915164 +0000 UTC m=+788.711165909" watchObservedRunningTime="2025-10-03 07:09:15.286961867 +0000 UTC m=+788.714212612" Oct 03 07:09:15 crc kubenswrapper[4810]: I1003 07:09:15.288716 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8ztlc" podStartSLOduration=2.909931169 podStartE2EDuration="5.288708704s" podCreationTimestamp="2025-10-03 07:09:10 +0000 UTC" firstStartedPulling="2025-10-03 07:09:10.79414715 +0000 UTC m=+784.221397885" lastFinishedPulling="2025-10-03 07:09:13.172924675 +0000 UTC m=+786.600175420" observedRunningTime="2025-10-03 07:09:14.300106852 +0000 UTC m=+787.727357627" watchObservedRunningTime="2025-10-03 07:09:15.288708704 +0000 UTC m=+788.715959459" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.288116 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" event={"ID":"27a427fb-1f71-4935-89e7-764400c772c9","Type":"ContainerStarted","Data":"52d2b49be3a66723ba9899e0cf8a42232b377656370357d79c086645bebcb3bd"} Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.310758 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-fc6mm" podStartSLOduration=1.923533116 podStartE2EDuration="6.310739159s" podCreationTimestamp="2025-10-03 07:09:10 +0000 UTC" firstStartedPulling="2025-10-03 07:09:11.151347711 +0000 UTC m=+784.578598446" lastFinishedPulling="2025-10-03 07:09:15.538553744 +0000 UTC m=+788.965804489" observedRunningTime="2025-10-03 07:09:16.308028956 +0000 UTC m=+789.735279761" watchObservedRunningTime="2025-10-03 07:09:16.310739159 +0000 UTC m=+789.737989884" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.416580 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.418995 4810 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.438832 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.524708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8rl\" (UniqueName: \"kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.524807 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.524881 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.626031 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.626361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8rl\" (UniqueName: \"kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.626507 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.626696 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.626936 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.664991 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8rl\" (UniqueName: \"kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl\") pod \"certified-operators-f7gkr\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:16 crc kubenswrapper[4810]: I1003 07:09:16.752557 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:17 crc kubenswrapper[4810]: I1003 07:09:17.026754 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:17 crc kubenswrapper[4810]: W1003 07:09:17.033083 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39351db9_e4b0_4a6b_869d_ddee8abb7c3d.slice/crio-b53e21410dacf940e2dd31bbec8f12b80680e2642cc341aa8666ff023ff0dc85 WatchSource:0}: Error finding container b53e21410dacf940e2dd31bbec8f12b80680e2642cc341aa8666ff023ff0dc85: Status 404 returned error can't find the container with id b53e21410dacf940e2dd31bbec8f12b80680e2642cc341aa8666ff023ff0dc85 Oct 03 07:09:17 crc kubenswrapper[4810]: I1003 07:09:17.296632 4810 generic.go:334] "Generic (PLEG): container finished" podID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerID="6a291ebca7bfaaf7f33276f847bfe033130a59d28ea46d9d5baf267f53dc8532" exitCode=0 Oct 03 07:09:17 crc kubenswrapper[4810]: I1003 07:09:17.296760 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerDied","Data":"6a291ebca7bfaaf7f33276f847bfe033130a59d28ea46d9d5baf267f53dc8532"} Oct 03 07:09:17 crc kubenswrapper[4810]: I1003 07:09:17.296828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerStarted","Data":"b53e21410dacf940e2dd31bbec8f12b80680e2642cc341aa8666ff023ff0dc85"} Oct 03 07:09:19 crc kubenswrapper[4810]: I1003 07:09:19.310336 4810 generic.go:334] "Generic (PLEG): container finished" podID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerID="40a6f6a0e7aaaf71bda51fc0b88b8e76290dc4c71a86d921b96262e5441fa339" exitCode=0 Oct 03 07:09:19 crc kubenswrapper[4810]: I1003 07:09:19.318500 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerDied","Data":"40a6f6a0e7aaaf71bda51fc0b88b8e76290dc4c71a86d921b96262e5441fa339"} Oct 03 07:09:19 crc kubenswrapper[4810]: I1003 07:09:19.944282 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:19 crc kubenswrapper[4810]: I1003 07:09:19.944546 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:19 crc kubenswrapper[4810]: I1003 07:09:19.990876 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.319147 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" 
event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerStarted","Data":"06daf077c2ade239727d8c34e27d7d13a0b699f6bf43326782f83f4a4f7de9a2"} Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.347246 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f7gkr" podStartSLOduration=1.892217882 podStartE2EDuration="4.347217766s" podCreationTimestamp="2025-10-03 07:09:16 +0000 UTC" firstStartedPulling="2025-10-03 07:09:17.299545237 +0000 UTC m=+790.726796002" lastFinishedPulling="2025-10-03 07:09:19.754545161 +0000 UTC m=+793.181795886" observedRunningTime="2025-10-03 07:09:20.342084717 +0000 UTC m=+793.769335512" watchObservedRunningTime="2025-10-03 07:09:20.347217766 +0000 UTC m=+793.774468541" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.360494 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.703487 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8ztlc" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.976064 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.976137 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:20 crc kubenswrapper[4810]: I1003 07:09:20.981960 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:21 crc kubenswrapper[4810]: I1003 07:09:21.336270 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bc65d5d98-4p6lt" Oct 03 07:09:21 crc kubenswrapper[4810]: I1003 07:09:21.420519 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 07:09:22 crc kubenswrapper[4810]: I1003 07:09:22.392888 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.344144 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfxxp" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="registry-server" containerID="cri-o://90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33" gracePeriod=2 Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.761216 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.832026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctvr6\" (UniqueName: \"kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6\") pod \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.832149 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities\") pod \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.832201 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content\") pod \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\" (UID: \"585b8930-ba1b-43ef-a57e-458a63c2ca8c\") " Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.834487 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities" (OuterVolumeSpecName: "utilities") pod "585b8930-ba1b-43ef-a57e-458a63c2ca8c" (UID: "585b8930-ba1b-43ef-a57e-458a63c2ca8c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.843207 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6" (OuterVolumeSpecName: "kube-api-access-ctvr6") pod "585b8930-ba1b-43ef-a57e-458a63c2ca8c" (UID: "585b8930-ba1b-43ef-a57e-458a63c2ca8c"). InnerVolumeSpecName "kube-api-access-ctvr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.845365 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "585b8930-ba1b-43ef-a57e-458a63c2ca8c" (UID: "585b8930-ba1b-43ef-a57e-458a63c2ca8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.934675 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.934734 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/585b8930-ba1b-43ef-a57e-458a63c2ca8c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:23 crc kubenswrapper[4810]: I1003 07:09:23.934762 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctvr6\" (UniqueName: \"kubernetes.io/projected/585b8930-ba1b-43ef-a57e-458a63c2ca8c-kube-api-access-ctvr6\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.353567 4810 generic.go:334] "Generic (PLEG): container finished" podID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerID="90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33" exitCode=0 Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.353604 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfxxp" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.353625 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerDied","Data":"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33"} Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.353675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfxxp" event={"ID":"585b8930-ba1b-43ef-a57e-458a63c2ca8c","Type":"ContainerDied","Data":"d115bd1089b5468fc9364156aef199d1bbb86f407690db7c55830a3009d2f016"} Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.353705 4810 scope.go:117] "RemoveContainer" containerID="90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.384382 4810 scope.go:117] "RemoveContainer" containerID="d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.406485 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.416755 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfxxp"] Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.419212 4810 scope.go:117] "RemoveContainer" containerID="537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.440353 4810 scope.go:117] "RemoveContainer" containerID="90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33" Oct 03 07:09:24 crc kubenswrapper[4810]: E1003 07:09:24.440844 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33\": container with ID starting with 90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33 not found: ID does not exist" containerID="90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.440885 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33"} err="failed to get container status \"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33\": rpc error: code = NotFound desc = could not find container \"90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33\": container with ID starting with 90f3d72bb048aa58c037275f88ba62be1cd03a895a06112bd3f6ebaafebd3a33 not found: ID does not exist" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.440927 4810 scope.go:117] "RemoveContainer" containerID="d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115" Oct 03 07:09:24 crc kubenswrapper[4810]: E1003 07:09:24.441702 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115\": container with ID starting with d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115 not found: ID does not exist" containerID="d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.441768 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115"} err="failed to get container status \"d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115\": rpc error: code = NotFound desc = could not find container \"d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115\": container with ID starting with d75a3717ce859060d77d362a8eb64097ab51d0854590db17c9a003347d533115 not found: ID does not exist" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.441817 4810 scope.go:117] "RemoveContainer" containerID="537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9" Oct 03 07:09:24 crc kubenswrapper[4810]: E1003 07:09:24.443188 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9\": container with ID starting with 537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9 not found: ID does not exist" containerID="537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9" Oct 03 07:09:24 crc kubenswrapper[4810]: I1003 07:09:24.443215 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9"} err="failed to get container status \"537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9\": rpc error: code = NotFound desc = could not find container \"537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9\": container with ID starting with 537e67bb929733a15cb0c384fcc283fac50d7138f0cdbacc2a7d05b932ce84c9 not found: ID does not exist" Oct 03 07:09:25 crc kubenswrapper[4810]: I1003 07:09:25.314217 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" path="/var/lib/kubelet/pods/585b8930-ba1b-43ef-a57e-458a63c2ca8c/volumes" Oct 03 07:09:26 crc kubenswrapper[4810]: I1003 07:09:26.753722 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:26 crc kubenswrapper[4810]: I1003 07:09:26.753802 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:26 crc kubenswrapper[4810]: I1003 07:09:26.818476 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:27 crc kubenswrapper[4810]: I1003 07:09:27.410463 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:27 crc kubenswrapper[4810]: I1003 07:09:27.997503 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:29 crc kubenswrapper[4810]: I1003 07:09:29.390216 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f7gkr" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="registry-server" containerID="cri-o://06daf077c2ade239727d8c34e27d7d13a0b699f6bf43326782f83f4a4f7de9a2" gracePeriod=2 Oct 03 07:09:30 crc kubenswrapper[4810]: I1003 07:09:30.402297 4810 generic.go:334] "Generic (PLEG): container finished" podID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerID="06daf077c2ade239727d8c34e27d7d13a0b699f6bf43326782f83f4a4f7de9a2" exitCode=0 Oct 03 07:09:30 crc kubenswrapper[4810]: I1003 07:09:30.402377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerDied","Data":"06daf077c2ade239727d8c34e27d7d13a0b699f6bf43326782f83f4a4f7de9a2"} Oct 03 07:09:30 crc kubenswrapper[4810]: I1003 07:09:30.644151 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-9qx25" Oct 03 07:09:30 crc kubenswrapper[4810]: I1003 07:09:30.940661 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.034721 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt8rl\" (UniqueName: \"kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl\") pod \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.035256 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities\") pod \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.035313 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content\") pod \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\" (UID: \"39351db9-e4b0-4a6b-869d-ddee8abb7c3d\") " Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.036467 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities" (OuterVolumeSpecName: "utilities") pod "39351db9-e4b0-4a6b-869d-ddee8abb7c3d" (UID: "39351db9-e4b0-4a6b-869d-ddee8abb7c3d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.042828 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl" (OuterVolumeSpecName: "kube-api-access-mt8rl") pod "39351db9-e4b0-4a6b-869d-ddee8abb7c3d" (UID: "39351db9-e4b0-4a6b-869d-ddee8abb7c3d"). InnerVolumeSpecName "kube-api-access-mt8rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.093017 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39351db9-e4b0-4a6b-869d-ddee8abb7c3d" (UID: "39351db9-e4b0-4a6b-869d-ddee8abb7c3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.137411 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt8rl\" (UniqueName: \"kubernetes.io/projected/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-kube-api-access-mt8rl\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.137540 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.137619 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39351db9-e4b0-4a6b-869d-ddee8abb7c3d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.417972 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f7gkr" event={"ID":"39351db9-e4b0-4a6b-869d-ddee8abb7c3d","Type":"ContainerDied","Data":"b53e21410dacf940e2dd31bbec8f12b80680e2642cc341aa8666ff023ff0dc85"} Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.418193 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f7gkr" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.418603 4810 scope.go:117] "RemoveContainer" containerID="06daf077c2ade239727d8c34e27d7d13a0b699f6bf43326782f83f4a4f7de9a2" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.469467 4810 scope.go:117] "RemoveContainer" containerID="40a6f6a0e7aaaf71bda51fc0b88b8e76290dc4c71a86d921b96262e5441fa339" Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.472935 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.484171 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f7gkr"] Oct 03 07:09:31 crc kubenswrapper[4810]: I1003 07:09:31.502434 4810 scope.go:117] "RemoveContainer" containerID="6a291ebca7bfaaf7f33276f847bfe033130a59d28ea46d9d5baf267f53dc8532" Oct 03 07:09:33 crc kubenswrapper[4810]: I1003 07:09:33.312819 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" path="/var/lib/kubelet/pods/39351db9-e4b0-4a6b-869d-ddee8abb7c3d/volumes" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.600853 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk"] Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601869 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="extract-content" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601885 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="extract-content" Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601916 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="extract-utilities" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601922 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="extract-utilities" Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601935 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="extract-content" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601942 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="extract-content" Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601956 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601963 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601972 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601978 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: E1003 07:09:44.601986 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="extract-utilities" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.601994 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="extract-utilities" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.602082 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="39351db9-e4b0-4a6b-869d-ddee8abb7c3d" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.602099 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="585b8930-ba1b-43ef-a57e-458a63c2ca8c" containerName="registry-server" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.602782 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.605050 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.649064 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk"] Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.776052 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.776390 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.776498 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vct9\" (UniqueName: \"kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.878557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.878702 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: 
\"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.878764 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vct9\" (UniqueName: \"kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.879800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.880523 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.904501 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vct9\" (UniqueName: \"kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:44 crc kubenswrapper[4810]: I1003 07:09:44.931413 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:45 crc kubenswrapper[4810]: I1003 07:09:45.408886 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk"] Oct 03 07:09:45 crc kubenswrapper[4810]: I1003 07:09:45.524664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" event={"ID":"c5b60a1d-b331-43fa-93ae-88e9bf8af024","Type":"ContainerStarted","Data":"9b69e42c638deb2d81614f35993f4e7dd21a253863173a73a087e1325141b3c9"} Oct 03 07:09:46 crc kubenswrapper[4810]: I1003 07:09:46.473728 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-mp8wp" podUID="5468dbc2-8210-4771-b950-cd96757c5788" containerName="console" containerID="cri-o://e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a" gracePeriod=15 Oct 03 07:09:46 crc kubenswrapper[4810]: I1003 07:09:46.534958 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerID="7c6a9f96e1ff4e4304ba6015680c315cbfb9a3db0d7223da81c2407183b7aae4" exitCode=0 Oct 03 07:09:46 crc kubenswrapper[4810]: I1003 07:09:46.535056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" event={"ID":"c5b60a1d-b331-43fa-93ae-88e9bf8af024","Type":"ContainerDied","Data":"7c6a9f96e1ff4e4304ba6015680c315cbfb9a3db0d7223da81c2407183b7aae4"} Oct 03 07:09:46 crc kubenswrapper[4810]: I1003 07:09:46.896925 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mp8wp_5468dbc2-8210-4771-b950-cd96757c5788/console/0.log" Oct 03 07:09:46 crc kubenswrapper[4810]: I1003 07:09:46.897483 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008457 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008524 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008568 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008598 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008695 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhqct\" (UniqueName: \"kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008765 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.008813 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert\") pod \"5468dbc2-8210-4771-b950-cd96757c5788\" (UID: \"5468dbc2-8210-4771-b950-cd96757c5788\") " Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.009252 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config" (OuterVolumeSpecName: "console-config") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.009455 4810 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.010022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca" (OuterVolumeSpecName: "service-ca") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.010091 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.010227 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.016174 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.016214 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct" (OuterVolumeSpecName: "kube-api-access-qhqct") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "kube-api-access-qhqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.017512 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5468dbc2-8210-4771-b950-cd96757c5788" (UID: "5468dbc2-8210-4771-b950-cd96757c5788"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110586 4810 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110629 4810 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110639 4810 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5468dbc2-8210-4771-b950-cd96757c5788-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110648 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhqct\" (UniqueName: \"kubernetes.io/projected/5468dbc2-8210-4771-b950-cd96757c5788-kube-api-access-qhqct\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110659 4810 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.110669 4810 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5468dbc2-8210-4771-b950-cd96757c5788-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.544143 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mp8wp_5468dbc2-8210-4771-b950-cd96757c5788/console/0.log" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.545074 4810 generic.go:334] "Generic (PLEG): container finished" podID="5468dbc2-8210-4771-b950-cd96757c5788" containerID="e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a" exitCode=2 Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.545132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mp8wp" event={"ID":"5468dbc2-8210-4771-b950-cd96757c5788","Type":"ContainerDied","Data":"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a"} Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.545168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mp8wp" event={"ID":"5468dbc2-8210-4771-b950-cd96757c5788","Type":"ContainerDied","Data":"dd3931bdb7ddb2b65487977640235541c03f0370d378428d91248eb1c11d69e2"} Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.545177 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mp8wp" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.545202 4810 scope.go:117] "RemoveContainer" containerID="e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.564994 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.578776 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-mp8wp"] Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.596262 4810 scope.go:117] "RemoveContainer" containerID="e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a" Oct 03 07:09:47 crc kubenswrapper[4810]: E1003 07:09:47.597046 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a\": container with ID starting with e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a not found: ID does not exist" containerID="e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a" Oct 03 07:09:47 crc kubenswrapper[4810]: I1003 07:09:47.597099 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a"} err="failed to get container status \"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a\": rpc error: code = NotFound desc = could not find container \"e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a\": container with ID starting with e87d3e45c55bc9bae5e5742a9ef5e66acffe434acf09a689f2450c42fbcd742a not found: ID does not exist" Oct 03 07:09:48 crc kubenswrapper[4810]: I1003 07:09:48.553167 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerID="dcb05257a7464d239be7d397a62a80df41ad2e5e91112bdff297e3ee4ec5aab9" exitCode=0 Oct 03 07:09:48 crc kubenswrapper[4810]: I1003 07:09:48.553285 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" event={"ID":"c5b60a1d-b331-43fa-93ae-88e9bf8af024","Type":"ContainerDied","Data":"dcb05257a7464d239be7d397a62a80df41ad2e5e91112bdff297e3ee4ec5aab9"} Oct 03 07:09:49 crc kubenswrapper[4810]: I1003 07:09:49.316445 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5468dbc2-8210-4771-b950-cd96757c5788" path="/var/lib/kubelet/pods/5468dbc2-8210-4771-b950-cd96757c5788/volumes" Oct 03 07:09:49 crc kubenswrapper[4810]: I1003 07:09:49.568238 4810 generic.go:334] "Generic (PLEG): container finished" podID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerID="c6f27b247ce48f067ee1ec2a137ec32e168f98c5310ea503e399b403259e8e30" exitCode=0 Oct 03 07:09:49 crc kubenswrapper[4810]: I1003 07:09:49.568332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" event={"ID":"c5b60a1d-b331-43fa-93ae-88e9bf8af024","Type":"ContainerDied","Data":"c6f27b247ce48f067ee1ec2a137ec32e168f98c5310ea503e399b403259e8e30"} Oct 03 07:09:50 crc kubenswrapper[4810]: I1003 07:09:50.928775 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.066125 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vct9\" (UniqueName: \"kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9\") pod \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.066314 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle\") pod \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.066366 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util\") pod \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\" (UID: \"c5b60a1d-b331-43fa-93ae-88e9bf8af024\") " Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.068787 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle" (OuterVolumeSpecName: "bundle") pod "c5b60a1d-b331-43fa-93ae-88e9bf8af024" (UID: "c5b60a1d-b331-43fa-93ae-88e9bf8af024"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.074088 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9" (OuterVolumeSpecName: "kube-api-access-2vct9") pod "c5b60a1d-b331-43fa-93ae-88e9bf8af024" (UID: "c5b60a1d-b331-43fa-93ae-88e9bf8af024"). InnerVolumeSpecName "kube-api-access-2vct9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.081556 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util" (OuterVolumeSpecName: "util") pod "c5b60a1d-b331-43fa-93ae-88e9bf8af024" (UID: "c5b60a1d-b331-43fa-93ae-88e9bf8af024"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.168814 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vct9\" (UniqueName: \"kubernetes.io/projected/c5b60a1d-b331-43fa-93ae-88e9bf8af024-kube-api-access-2vct9\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.168955 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.168980 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5b60a1d-b331-43fa-93ae-88e9bf8af024-util\") on node \"crc\" DevicePath \"\"" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.586776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" event={"ID":"c5b60a1d-b331-43fa-93ae-88e9bf8af024","Type":"ContainerDied","Data":"9b69e42c638deb2d81614f35993f4e7dd21a253863173a73a087e1325141b3c9"} Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.586831 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b69e42c638deb2d81614f35993f4e7dd21a253863173a73a087e1325141b3c9" Oct 03 07:09:51 crc kubenswrapper[4810]: I1003 07:09:51.587122 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.042540 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j"] Oct 03 07:10:00 crc kubenswrapper[4810]: E1003 07:10:00.043378 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5468dbc2-8210-4771-b950-cd96757c5788" containerName="console" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043393 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5468dbc2-8210-4771-b950-cd96757c5788" containerName="console" Oct 03 07:10:00 crc kubenswrapper[4810]: E1003 07:10:00.043411 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="util" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043418 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="util" Oct 03 07:10:00 crc kubenswrapper[4810]: E1003 07:10:00.043428 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="pull" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043435 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="pull" Oct 03 07:10:00 crc kubenswrapper[4810]: E1003 07:10:00.043447 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="extract" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043454 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="extract" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043558 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5468dbc2-8210-4771-b950-cd96757c5788" containerName="console" Oct 
03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.043571 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b60a1d-b331-43fa-93ae-88e9bf8af024" containerName="extract" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.044037 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.046469 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.051568 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dc7fw" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.051589 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.051569 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.052347 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.072301 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j"] Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.188526 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-webhook-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.188578 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.188613 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvdq\" (UniqueName: \"kubernetes.io/projected/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-kube-api-access-bsvdq\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.278009 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb"] Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.278638 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.280792 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pdwzh" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.280799 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.281360 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.289799 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-webhook-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.289871 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.290021 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvdq\" (UniqueName: \"kubernetes.io/projected/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-kube-api-access-bsvdq\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.300063 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb"] Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.307067 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-webhook-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.307069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-apiservice-cert\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.326822 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvdq\" (UniqueName: \"kubernetes.io/projected/bacb13e1-0d46-40d5-96a0-e765fdfd6a2c-kube-api-access-bsvdq\") pod \"metallb-operator-controller-manager-7d4655dff5-sw47j\" (UID: \"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c\") " pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.357786 4810 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.392242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-apiservice-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.392349 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-webhook-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.392399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4f79\" (UniqueName: \"kubernetes.io/projected/a569ffff-2a47-4dfc-b245-9453c87786bf-kube-api-access-d4f79\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.495673 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-apiservice-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.496035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-webhook-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.496519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4f79\" (UniqueName: \"kubernetes.io/projected/a569ffff-2a47-4dfc-b245-9453c87786bf-kube-api-access-d4f79\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.500784 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-apiservice-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.503536 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a569ffff-2a47-4dfc-b245-9453c87786bf-webhook-cert\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: 
\"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.515291 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4f79\" (UniqueName: \"kubernetes.io/projected/a569ffff-2a47-4dfc-b245-9453c87786bf-kube-api-access-d4f79\") pod \"metallb-operator-webhook-server-86c9c5f47f-wwgnb\" (UID: \"a569ffff-2a47-4dfc-b245-9453c87786bf\") " pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.607351 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.829339 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j"] Oct 03 07:10:00 crc kubenswrapper[4810]: I1003 07:10:00.874040 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb"] Oct 03 07:10:00 crc kubenswrapper[4810]: W1003 07:10:00.886583 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda569ffff_2a47_4dfc_b245_9453c87786bf.slice/crio-175e6913ca5a7c6d05673f7a8dd7cc91047daa47a40525cb65d93af9a3c8f03f WatchSource:0}: Error finding container 175e6913ca5a7c6d05673f7a8dd7cc91047daa47a40525cb65d93af9a3c8f03f: Status 404 returned error can't find the container with id 175e6913ca5a7c6d05673f7a8dd7cc91047daa47a40525cb65d93af9a3c8f03f Oct 03 07:10:01 crc kubenswrapper[4810]: I1003 07:10:01.643941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" event={"ID":"a569ffff-2a47-4dfc-b245-9453c87786bf","Type":"ContainerStarted","Data":"175e6913ca5a7c6d05673f7a8dd7cc91047daa47a40525cb65d93af9a3c8f03f"} Oct 03 07:10:01 crc kubenswrapper[4810]: I1003 07:10:01.645366 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" event={"ID":"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c","Type":"ContainerStarted","Data":"89f545fa84171cb63b565965239c1cfec06ab1ff0fd4bea22dc3ca7cea946376"} Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.685104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" event={"ID":"a569ffff-2a47-4dfc-b245-9453c87786bf","Type":"ContainerStarted","Data":"7b79ed5657dc2a474654ac9a84e8709ab95d24dbfae11cd9f638e0ccdfe3e0b7"} Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.685741 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.687118 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" event={"ID":"bacb13e1-0d46-40d5-96a0-e765fdfd6a2c","Type":"ContainerStarted","Data":"2966519bd31da85fe6f42591426185fc21bd99d3e452d9b4f491530e2a0c03cb"} Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.687267 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.706322 4810 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" podStartSLOduration=2.040354858 podStartE2EDuration="6.706312534s" podCreationTimestamp="2025-10-03 07:10:00 +0000 UTC" firstStartedPulling="2025-10-03 07:10:00.894206549 +0000 UTC m=+834.321457284" lastFinishedPulling="2025-10-03 07:10:05.560164215 +0000 UTC m=+838.987414960" observedRunningTime="2025-10-03 07:10:06.705199674 +0000 UTC m=+840.132450399" watchObservedRunningTime="2025-10-03 07:10:06.706312534 +0000 UTC m=+840.133563269" Oct 03 07:10:06 crc kubenswrapper[4810]: I1003 07:10:06.727769 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" podStartSLOduration=2.025499907 podStartE2EDuration="6.727750603s" podCreationTimestamp="2025-10-03 07:10:00 +0000 UTC" firstStartedPulling="2025-10-03 07:10:00.844153326 +0000 UTC m=+834.271404061" lastFinishedPulling="2025-10-03 07:10:05.546404022 +0000 UTC m=+838.973654757" observedRunningTime="2025-10-03 07:10:06.724990949 +0000 UTC m=+840.152241714" watchObservedRunningTime="2025-10-03 07:10:06.727750603 +0000 UTC m=+840.155001338" Oct 03 07:10:20 crc kubenswrapper[4810]: I1003 07:10:20.612185 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-86c9c5f47f-wwgnb" Oct 03 07:10:24 crc kubenswrapper[4810]: I1003 07:10:24.995934 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:24 crc kubenswrapper[4810]: I1003 07:10:24.997199 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.009133 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.052661 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.053045 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.053095 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr69p\" (UniqueName: \"kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.153812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " 
pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.153883 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.153953 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr69p\" (UniqueName: \"kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.154486 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.154512 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.181149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr69p\" (UniqueName: \"kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p\") pod \"community-operators-s28g5\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.325516 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:25 crc kubenswrapper[4810]: I1003 07:10:25.814581 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:25 crc kubenswrapper[4810]: W1003 07:10:25.823463 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0be10e_4fd2_4d66_9707_82ae90262586.slice/crio-0f34baf83f52ce5390210a9b852bd701475dcb2381584b876e4823b175f9c411 WatchSource:0}: Error finding container 0f34baf83f52ce5390210a9b852bd701475dcb2381584b876e4823b175f9c411: Status 404 returned error can't find the container with id 0f34baf83f52ce5390210a9b852bd701475dcb2381584b876e4823b175f9c411 Oct 03 07:10:26 crc kubenswrapper[4810]: I1003 07:10:26.817291 4810 generic.go:334] "Generic (PLEG): container finished" podID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerID="e05c059c97927859c0d397baec4478e5c7a61d309a7667285160636532286f43" exitCode=0 Oct 03 07:10:26 crc kubenswrapper[4810]: I1003 07:10:26.817335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerDied","Data":"e05c059c97927859c0d397baec4478e5c7a61d309a7667285160636532286f43"} Oct 03 07:10:26 crc kubenswrapper[4810]: I1003 07:10:26.817361 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerStarted","Data":"0f34baf83f52ce5390210a9b852bd701475dcb2381584b876e4823b175f9c411"} Oct 03 07:10:27 crc kubenswrapper[4810]: I1003 07:10:27.823161 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerStarted","Data":"cd08486124375b49655f185778875bb7988b2642f65c0f0ce0bdb34456265841"} Oct 03 07:10:28 crc kubenswrapper[4810]: I1003 07:10:28.834551 4810 generic.go:334] "Generic (PLEG): container finished" podID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerID="cd08486124375b49655f185778875bb7988b2642f65c0f0ce0bdb34456265841" exitCode=0 Oct 03 07:10:28 crc kubenswrapper[4810]: I1003 07:10:28.834633 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerDied","Data":"cd08486124375b49655f185778875bb7988b2642f65c0f0ce0bdb34456265841"} Oct 03 07:10:29 crc kubenswrapper[4810]: I1003 07:10:29.843313 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerStarted","Data":"447d4edf84e42cc3372f36f23fb7920cb4be57af682a0f3e337c840a079efaf0"} Oct 03 07:10:29 crc kubenswrapper[4810]: I1003 07:10:29.866160 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s28g5" podStartSLOduration=3.42058586 podStartE2EDuration="5.866138278s" podCreationTimestamp="2025-10-03 07:10:24 +0000 UTC" firstStartedPulling="2025-10-03 07:10:26.821409327 +0000 UTC m=+860.248660062" lastFinishedPulling="2025-10-03 07:10:29.266961755 +0000 UTC m=+862.694212480" observedRunningTime="2025-10-03 07:10:29.861580585 +0000 UTC m=+863.288831330" watchObservedRunningTime="2025-10-03 07:10:29.866138278 +0000 UTC m=+863.293389033" Oct 03 
07:10:32 crc kubenswrapper[4810]: I1003 07:10:32.088986 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:10:32 crc kubenswrapper[4810]: I1003 07:10:32.090781 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:10:35 crc kubenswrapper[4810]: I1003 07:10:35.326322 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:35 crc kubenswrapper[4810]: I1003 07:10:35.326421 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:35 crc kubenswrapper[4810]: I1003 07:10:35.370958 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:35 crc kubenswrapper[4810]: I1003 07:10:35.943422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:37 crc kubenswrapper[4810]: I1003 07:10:37.784221 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:37 crc kubenswrapper[4810]: I1003 07:10:37.887292 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s28g5" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="registry-server" containerID="cri-o://447d4edf84e42cc3372f36f23fb7920cb4be57af682a0f3e337c840a079efaf0" gracePeriod=2 Oct 03 07:10:38 crc kubenswrapper[4810]: I1003 07:10:38.897993 4810 generic.go:334] "Generic (PLEG): container finished" podID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerID="447d4edf84e42cc3372f36f23fb7920cb4be57af682a0f3e337c840a079efaf0" exitCode=0 Oct 03 07:10:38 crc kubenswrapper[4810]: I1003 07:10:38.898052 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerDied","Data":"447d4edf84e42cc3372f36f23fb7920cb4be57af682a0f3e337c840a079efaf0"} Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.443239 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.633341 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities\") pod \"4c0be10e-4fd2-4d66-9707-82ae90262586\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.633475 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content\") pod \"4c0be10e-4fd2-4d66-9707-82ae90262586\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.633517 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr69p\" (UniqueName: \"kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p\") pod \"4c0be10e-4fd2-4d66-9707-82ae90262586\" (UID: \"4c0be10e-4fd2-4d66-9707-82ae90262586\") " Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.634287 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities" (OuterVolumeSpecName: "utilities") pod "4c0be10e-4fd2-4d66-9707-82ae90262586" (UID: "4c0be10e-4fd2-4d66-9707-82ae90262586"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.638261 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p" (OuterVolumeSpecName: "kube-api-access-kr69p") pod "4c0be10e-4fd2-4d66-9707-82ae90262586" (UID: "4c0be10e-4fd2-4d66-9707-82ae90262586"). InnerVolumeSpecName "kube-api-access-kr69p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.687719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c0be10e-4fd2-4d66-9707-82ae90262586" (UID: "4c0be10e-4fd2-4d66-9707-82ae90262586"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.734961 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.734996 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0be10e-4fd2-4d66-9707-82ae90262586-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.735007 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr69p\" (UniqueName: \"kubernetes.io/projected/4c0be10e-4fd2-4d66-9707-82ae90262586-kube-api-access-kr69p\") on node \"crc\" DevicePath \"\"" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.909618 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s28g5" event={"ID":"4c0be10e-4fd2-4d66-9707-82ae90262586","Type":"ContainerDied","Data":"0f34baf83f52ce5390210a9b852bd701475dcb2381584b876e4823b175f9c411"} Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.909681 4810 scope.go:117] "RemoveContainer" containerID="447d4edf84e42cc3372f36f23fb7920cb4be57af682a0f3e337c840a079efaf0" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.910786 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s28g5" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.948958 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.950998 4810 scope.go:117] "RemoveContainer" containerID="cd08486124375b49655f185778875bb7988b2642f65c0f0ce0bdb34456265841" Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.954303 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s28g5"] Oct 03 07:10:39 crc kubenswrapper[4810]: I1003 07:10:39.970481 4810 scope.go:117] "RemoveContainer" containerID="e05c059c97927859c0d397baec4478e5c7a61d309a7667285160636532286f43" Oct 03 07:10:40 crc kubenswrapper[4810]: I1003 07:10:40.360346 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7d4655dff5-sw47j" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.147609 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h7lfx"] Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.147954 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="extract-content" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.147968 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="extract-content" Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.147985 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="registry-server" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.147993 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="registry-server" Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.148009 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="extract-utilities" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.148018 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="extract-utilities" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.148164 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" containerName="registry-server" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.150530 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.151276 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.151939 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.152067 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.152189 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zwb9z" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.152884 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.154777 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.163007 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.236154 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ln5cj"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.236974 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.241220 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.241500 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.241669 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pdbq9" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.241811 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.252676 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-hvl4d"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.253480 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clg8p\" (UniqueName: \"kubernetes.io/projected/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-kube-api-access-clg8p\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257346 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-conf\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257370 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-sockets\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257391 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics-certs\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257415 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-reloader\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257447 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj69c\" (UniqueName: \"kubernetes.io/projected/d3603911-89b0-4e7e-bec3-f7496980b97a-kube-api-access-qj69c\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257476 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-startup\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.257492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " 
pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.260603 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.268367 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-hvl4d"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.310150 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0be10e-4fd2-4d66-9707-82ae90262586" path="/var/lib/kubelet/pods/4c0be10e-4fd2-4d66-9707-82ae90262586/volumes" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359188 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj69c\" (UniqueName: \"kubernetes.io/projected/d3603911-89b0-4e7e-bec3-f7496980b97a-kube-api-access-qj69c\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359269 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-startup\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359284 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metallb-excludel2\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359330 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clg8p\" (UniqueName: \"kubernetes.io/projected/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-kube-api-access-clg8p\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359344 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs797\" (UniqueName: \"kubernetes.io/projected/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-kube-api-access-hs797\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlh25\" (UniqueName: 
\"kubernetes.io/projected/882eba24-ec61-49fd-8048-32ded4a05a45-kube-api-access-nlh25\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359394 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-conf\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359413 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-metrics-certs\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359432 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-sockets\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359455 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metrics-certs\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359471 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics-certs\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359486 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-cert\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359522 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-reloader\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.359848 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-reloader\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc 
kubenswrapper[4810]: E1003 07:10:41.359957 4810 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.360003 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert podName:d3603911-89b0-4e7e-bec3-f7496980b97a nodeName:}" failed. No retries permitted until 2025-10-03 07:10:41.859987104 +0000 UTC m=+875.287237839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert") pod "frr-k8s-webhook-server-64bf5d555-zv2bz" (UID: "d3603911-89b0-4e7e-bec3-f7496980b97a") : secret "frr-k8s-webhook-server-cert" not found Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.360907 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-startup\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.361078 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.361480 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-sockets\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.361525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-frr-conf\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.372071 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-metrics-certs\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.375780 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj69c\" (UniqueName: \"kubernetes.io/projected/d3603911-89b0-4e7e-bec3-f7496980b97a-kube-api-access-qj69c\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.378707 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clg8p\" (UniqueName: \"kubernetes.io/projected/e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9-kube-api-access-clg8p\") pod \"frr-k8s-h7lfx\" (UID: \"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9\") " pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlh25\" (UniqueName: 
\"kubernetes.io/projected/882eba24-ec61-49fd-8048-32ded4a05a45-kube-api-access-nlh25\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461202 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-metrics-certs\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461240 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metrics-certs\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461267 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-cert\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metallb-excludel2\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.461387 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs797\" (UniqueName: \"kubernetes.io/projected/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-kube-api-access-hs797\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.461784 4810 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.461837 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist podName:66aee647-ecf6-432b-95a9-6f2bcf7cf9cf nodeName:}" failed. No retries permitted until 2025-10-03 07:10:41.961821838 +0000 UTC m=+875.389072573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist") pod "speaker-ln5cj" (UID: "66aee647-ecf6-432b-95a9-6f2bcf7cf9cf") : secret "metallb-memberlist" not found Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.462635 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metallb-excludel2\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.465157 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-metrics-certs\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.466007 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-metrics-certs\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.475107 4810 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.475170 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.475973 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs797\" (UniqueName: \"kubernetes.io/projected/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-kube-api-access-hs797\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.479216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlh25\" (UniqueName: \"kubernetes.io/projected/882eba24-ec61-49fd-8048-32ded4a05a45-kube-api-access-nlh25\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.485038 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/882eba24-ec61-49fd-8048-32ded4a05a45-cert\") pod \"controller-68d546b9d8-hvl4d\" (UID: \"882eba24-ec61-49fd-8048-32ded4a05a45\") " pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.569160 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.760059 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-hvl4d"] Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.866672 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.872832 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d3603911-89b0-4e7e-bec3-f7496980b97a-cert\") pod \"frr-k8s-webhook-server-64bf5d555-zv2bz\" (UID: \"d3603911-89b0-4e7e-bec3-f7496980b97a\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.925105 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"1dec21ccc521244df1d2b6ff8e807902ecd0712aa7be075e696922c383a66f3a"} Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.929294 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-hvl4d" event={"ID":"882eba24-ec61-49fd-8048-32ded4a05a45","Type":"ContainerStarted","Data":"a6adc37707915fdbbec91a71f33213578712762f365ef0e574471b9217ca99f3"} Oct 03 07:10:41 crc kubenswrapper[4810]: I1003 07:10:41.968414 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.968579 4810 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 07:10:41 crc kubenswrapper[4810]: E1003 07:10:41.968654 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist podName:66aee647-ecf6-432b-95a9-6f2bcf7cf9cf nodeName:}" failed. No retries permitted until 2025-10-03 07:10:42.968637774 +0000 UTC m=+876.395888509 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist") pod "speaker-ln5cj" (UID: "66aee647-ecf6-432b-95a9-6f2bcf7cf9cf") : secret "metallb-memberlist" not found Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.086141 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.327058 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz"] Oct 03 07:10:42 crc kubenswrapper[4810]: W1003 07:10:42.347193 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3603911_89b0_4e7e_bec3_f7496980b97a.slice/crio-6f9c46d952996626b9993e3608281785612d2267d689baede7b726b10b893de5 WatchSource:0}: Error finding container 6f9c46d952996626b9993e3608281785612d2267d689baede7b726b10b893de5: Status 404 returned error can't find the container with id 6f9c46d952996626b9993e3608281785612d2267d689baede7b726b10b893de5 Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.938473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-hvl4d" event={"ID":"882eba24-ec61-49fd-8048-32ded4a05a45","Type":"ContainerStarted","Data":"55bd379c459eebda29999c419f30d06a09a7377c8162e032ad6f41b77dbb491c"} Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.938937 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-hvl4d" event={"ID":"882eba24-ec61-49fd-8048-32ded4a05a45","Type":"ContainerStarted","Data":"076ca752a874a45a3b96ccdf36258eba06e855e0a4c07236bff5ccfea72053e9"} Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.938967 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.940760 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" event={"ID":"d3603911-89b0-4e7e-bec3-f7496980b97a","Type":"ContainerStarted","Data":"6f9c46d952996626b9993e3608281785612d2267d689baede7b726b10b893de5"} Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.964392 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-hvl4d" podStartSLOduration=1.9638614159999999 podStartE2EDuration="1.963861416s" podCreationTimestamp="2025-10-03 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:10:42.960392042 +0000 UTC m=+876.387642797" watchObservedRunningTime="2025-10-03 07:10:42.963861416 +0000 UTC m=+876.391112151" Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.983393 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:42 crc kubenswrapper[4810]: I1003 07:10:42.990773 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/66aee647-ecf6-432b-95a9-6f2bcf7cf9cf-memberlist\") pod \"speaker-ln5cj\" (UID: \"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf\") " pod="metallb-system/speaker-ln5cj" Oct 03 07:10:43 crc kubenswrapper[4810]: I1003 07:10:43.050047 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-ln5cj" Oct 03 07:10:43 crc kubenswrapper[4810]: I1003 07:10:43.951988 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ln5cj" event={"ID":"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf","Type":"ContainerStarted","Data":"026c5e43e0d4741b0a295325a2d96e5bfa1d135d8f1d273b52ff840d2b06e26f"} Oct 03 07:10:43 crc kubenswrapper[4810]: I1003 07:10:43.952267 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ln5cj" event={"ID":"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf","Type":"ContainerStarted","Data":"45f7a9c995e209b693f0852d08b1ca0bce2f6afb6ed001fa14342ee271a57756"} Oct 03 07:10:44 crc kubenswrapper[4810]: I1003 07:10:44.965562 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ln5cj" event={"ID":"66aee647-ecf6-432b-95a9-6f2bcf7cf9cf","Type":"ContainerStarted","Data":"d4f88029bc7d091531715505e60520eca061bb84100da3442a335fbc0d4e3956"} Oct 03 07:10:44 crc kubenswrapper[4810]: I1003 07:10:44.965913 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ln5cj" Oct 03 07:10:47 crc kubenswrapper[4810]: I1003 07:10:47.326875 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ln5cj" podStartSLOduration=6.326836038 podStartE2EDuration="6.326836038s" podCreationTimestamp="2025-10-03 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:10:44.994088551 +0000 UTC m=+878.421339286" watchObservedRunningTime="2025-10-03 07:10:47.326836038 +0000 UTC m=+880.754086773" Oct 03 07:10:51 crc kubenswrapper[4810]: I1003 07:10:51.007322 4810 generic.go:334] "Generic (PLEG): container finished" podID="e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9" containerID="aff84a971e08f543f4e3a3cf4bcf74f370040447f57727118f13fdec5a4b6a50" exitCode=0 Oct 03 07:10:51 crc kubenswrapper[4810]: I1003 07:10:51.007412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerDied","Data":"aff84a971e08f543f4e3a3cf4bcf74f370040447f57727118f13fdec5a4b6a50"} Oct 03 07:10:51 crc kubenswrapper[4810]: I1003 07:10:51.009373 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" event={"ID":"d3603911-89b0-4e7e-bec3-f7496980b97a","Type":"ContainerStarted","Data":"4f31e7a523f938f61ef0267c445095a611ed11b542b9f6a8af5a59343ae9b7b7"} Oct 03 07:10:51 crc kubenswrapper[4810]: I1003 07:10:51.010168 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:10:51 crc kubenswrapper[4810]: I1003 07:10:51.073368 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" podStartSLOduration=2.2923686229999998 podStartE2EDuration="10.073345365s" podCreationTimestamp="2025-10-03 07:10:41 +0000 UTC" firstStartedPulling="2025-10-03 07:10:42.350132098 +0000 UTC m=+875.777382833" lastFinishedPulling="2025-10-03 07:10:50.13110884 +0000 UTC m=+883.558359575" observedRunningTime="2025-10-03 07:10:51.07095112 +0000 UTC m=+884.498201865" watchObservedRunningTime="2025-10-03 07:10:51.073345365 +0000 UTC m=+884.500596100" Oct 03 07:10:52 crc kubenswrapper[4810]: I1003 07:10:52.017588 4810 generic.go:334] "Generic (PLEG): container finished" 
podID="e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9" containerID="610ff81bdc42a9d3ce2507cfa772f8bfbc2c40349c328200371ab8a3687adc0e" exitCode=0 Oct 03 07:10:52 crc kubenswrapper[4810]: I1003 07:10:52.017650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerDied","Data":"610ff81bdc42a9d3ce2507cfa772f8bfbc2c40349c328200371ab8a3687adc0e"} Oct 03 07:10:53 crc kubenswrapper[4810]: I1003 07:10:53.027377 4810 generic.go:334] "Generic (PLEG): container finished" podID="e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9" containerID="177823ab606b1d94354fbbbacaec71538e026bbc8ae408dbf96894a2f49805a6" exitCode=0 Oct 03 07:10:53 crc kubenswrapper[4810]: I1003 07:10:53.027417 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerDied","Data":"177823ab606b1d94354fbbbacaec71538e026bbc8ae408dbf96894a2f49805a6"} Oct 03 07:10:53 crc kubenswrapper[4810]: I1003 07:10:53.054313 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ln5cj" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.035860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"68dfc9c5045f8ba2a75017fedf391571c078d600407f6cb29a5f33bf775ad486"} Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.036173 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"e5cb4cfcff6660cb9d725f5c2763107afd826c8dc231388b47ca4f5c91a8d8de"} Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.036186 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"80ac98a25802020822242d7607a810f61afc66e00bb360412ee6a6141e864c82"} Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.588361 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l"] Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.590032 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.594082 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.603813 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l"] Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.741768 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.741909 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.741961 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ghdp\" (UniqueName: \"kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.842794 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.842849 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ghdp\" (UniqueName: \"kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.842882 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.843413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.843466 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.861287 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ghdp\" (UniqueName: \"kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:54 crc kubenswrapper[4810]: I1003 07:10:54.904413 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.046286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"e9260212ded6a0efe24d446f8b95e94eff9e01136aee42c9f68aebc165b74f2c"} Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.046657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"668a8e48c413bfbbf79c3ec73f17633e80318e6cbf4abc7bbc919daca303e111"} Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.046674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h7lfx" event={"ID":"e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9","Type":"ContainerStarted","Data":"413a4cccd6f8db63dc4c0e03ef16e87f07b6ef9719f9c5ab6021bf5f282c5a02"} Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.048075 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.072009 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h7lfx" podStartSLOduration=5.7184522730000005 podStartE2EDuration="14.071993166s" podCreationTimestamp="2025-10-03 07:10:41 +0000 UTC" firstStartedPulling="2025-10-03 07:10:41.724495717 +0000 UTC m=+875.151746452" lastFinishedPulling="2025-10-03 07:10:50.07803661 +0000 UTC m=+883.505287345" observedRunningTime="2025-10-03 07:10:55.070630969 +0000 UTC m=+888.497881724" watchObservedRunningTime="2025-10-03 07:10:55.071993166 +0000 UTC m=+888.499243901" Oct 03 07:10:55 crc kubenswrapper[4810]: I1003 07:10:55.094665 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l"] Oct 03 07:10:56 crc kubenswrapper[4810]: I1003 07:10:56.053507 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerStarted","Data":"5e42072e49500280b161c95ec6aacb61e1a08003a0532bffaf2dc5af566caf04"} Oct 03 07:10:56 crc kubenswrapper[4810]: I1003 07:10:56.476404 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:56 crc kubenswrapper[4810]: I1003 07:10:56.549561 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:10:57 crc kubenswrapper[4810]: I1003 07:10:57.060856 4810 generic.go:334] "Generic (PLEG): container finished" podID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerID="c5b96af37d3d140a0397a0c21d1bff1d1384dfd691f2a01f62dd7d99442ada88" exitCode=0 Oct 03 07:10:57 crc kubenswrapper[4810]: I1003 07:10:57.060941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerDied","Data":"c5b96af37d3d140a0397a0c21d1bff1d1384dfd691f2a01f62dd7d99442ada88"} Oct 03 07:11:00 crc kubenswrapper[4810]: I1003 07:11:00.081462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerStarted","Data":"d4b84f47fc6c9c7fda7df8c652d55182085d27e67b562afd5ac20df08af2cb0a"} Oct 03 07:11:01 crc kubenswrapper[4810]: I1003 07:11:01.088496 4810 generic.go:334] "Generic (PLEG): container finished" podID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerID="d4b84f47fc6c9c7fda7df8c652d55182085d27e67b562afd5ac20df08af2cb0a" exitCode=0 Oct 03 07:11:01 crc kubenswrapper[4810]: I1003 07:11:01.088538 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerDied","Data":"d4b84f47fc6c9c7fda7df8c652d55182085d27e67b562afd5ac20df08af2cb0a"} Oct 03 07:11:01 crc kubenswrapper[4810]: I1003 07:11:01.575473 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-hvl4d" Oct 03 07:11:02 crc kubenswrapper[4810]: I1003 07:11:02.088463 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:11:02 crc kubenswrapper[4810]: I1003 07:11:02.088659 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:11:02 crc kubenswrapper[4810]: I1003 07:11:02.093191 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-zv2bz" Oct 03 07:11:02 crc kubenswrapper[4810]: I1003 07:11:02.097953 4810 generic.go:334] "Generic (PLEG): container finished" podID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" 
containerID="54b0248c8d59f551ec21dc756fcc8ce0d2200753a9380d51561c36a12c8c153d" exitCode=0 Oct 03 07:11:02 crc kubenswrapper[4810]: I1003 07:11:02.098009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerDied","Data":"54b0248c8d59f551ec21dc756fcc8ce0d2200753a9380d51561c36a12c8c153d"} Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.391396 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.574516 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util\") pod \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.574573 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle\") pod \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.574630 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ghdp\" (UniqueName: \"kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp\") pod \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\" (UID: \"8db4abc8-dfa1-444e-a0bc-5103e7c66782\") " Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.576683 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle" (OuterVolumeSpecName: "bundle") pod "8db4abc8-dfa1-444e-a0bc-5103e7c66782" (UID: "8db4abc8-dfa1-444e-a0bc-5103e7c66782"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.587244 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp" (OuterVolumeSpecName: "kube-api-access-2ghdp") pod "8db4abc8-dfa1-444e-a0bc-5103e7c66782" (UID: "8db4abc8-dfa1-444e-a0bc-5103e7c66782"). InnerVolumeSpecName "kube-api-access-2ghdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.595747 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util" (OuterVolumeSpecName: "util") pod "8db4abc8-dfa1-444e-a0bc-5103e7c66782" (UID: "8db4abc8-dfa1-444e-a0bc-5103e7c66782"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.676757 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-util\") on node \"crc\" DevicePath \"\"" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.676829 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8db4abc8-dfa1-444e-a0bc-5103e7c66782-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:11:03 crc kubenswrapper[4810]: I1003 07:11:03.676875 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ghdp\" (UniqueName: \"kubernetes.io/projected/8db4abc8-dfa1-444e-a0bc-5103e7c66782-kube-api-access-2ghdp\") on node \"crc\" DevicePath \"\"" Oct 03 07:11:04 crc kubenswrapper[4810]: I1003 07:11:04.112669 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" event={"ID":"8db4abc8-dfa1-444e-a0bc-5103e7c66782","Type":"ContainerDied","Data":"5e42072e49500280b161c95ec6aacb61e1a08003a0532bffaf2dc5af566caf04"} Oct 03 07:11:04 crc kubenswrapper[4810]: I1003 07:11:04.112714 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e42072e49500280b161c95ec6aacb61e1a08003a0532bffaf2dc5af566caf04" Oct 03 07:11:04 crc kubenswrapper[4810]: I1003 07:11:04.112756 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.854309 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck"] Oct 03 07:11:07 crc kubenswrapper[4810]: E1003 07:11:07.854865 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="extract" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.854879 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="extract" Oct 03 07:11:07 crc kubenswrapper[4810]: E1003 07:11:07.854915 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="pull" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.854924 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="pull" Oct 03 07:11:07 crc kubenswrapper[4810]: E1003 07:11:07.854942 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="util" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.854949 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="util" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.855096 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db4abc8-dfa1-444e-a0bc-5103e7c66782" containerName="extract" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.855555 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.858723 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-6rsq2" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.859014 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.860426 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 03 07:11:07 crc kubenswrapper[4810]: I1003 07:11:07.881670 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck"] Oct 03 07:11:08 crc kubenswrapper[4810]: I1003 07:11:08.032000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlj2h\" (UniqueName: \"kubernetes.io/projected/94e9adea-df96-46c6-86ac-5422f880b631-kube-api-access-jlj2h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdwck\" (UID: \"94e9adea-df96-46c6-86ac-5422f880b631\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" Oct 03 07:11:08 crc kubenswrapper[4810]: I1003 07:11:08.133225 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlj2h\" (UniqueName: \"kubernetes.io/projected/94e9adea-df96-46c6-86ac-5422f880b631-kube-api-access-jlj2h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdwck\" (UID: \"94e9adea-df96-46c6-86ac-5422f880b631\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" Oct 03 07:11:08 crc kubenswrapper[4810]: I1003 07:11:08.165027 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlj2h\" (UniqueName: \"kubernetes.io/projected/94e9adea-df96-46c6-86ac-5422f880b631-kube-api-access-jlj2h\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdwck\" (UID: \"94e9adea-df96-46c6-86ac-5422f880b631\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" Oct 03 07:11:08 crc kubenswrapper[4810]: I1003 07:11:08.174180 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" Oct 03 07:11:08 crc kubenswrapper[4810]: I1003 07:11:08.723267 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck"] Oct 03 07:11:08 crc kubenswrapper[4810]: W1003 07:11:08.727708 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e9adea_df96_46c6_86ac_5422f880b631.slice/crio-fdb831e269cb448f4f095c775959820614162674c75313b9fa15e73cdf695b37 WatchSource:0}: Error finding container fdb831e269cb448f4f095c775959820614162674c75313b9fa15e73cdf695b37: Status 404 returned error can't find the container with id fdb831e269cb448f4f095c775959820614162674c75313b9fa15e73cdf695b37 Oct 03 07:11:09 crc kubenswrapper[4810]: I1003 07:11:09.148272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" event={"ID":"94e9adea-df96-46c6-86ac-5422f880b631","Type":"ContainerStarted","Data":"fdb831e269cb448f4f095c775959820614162674c75313b9fa15e73cdf695b37"} Oct 03 07:11:11 crc kubenswrapper[4810]: I1003 07:11:11.478043 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h7lfx" Oct 03 07:11:15 crc kubenswrapper[4810]: I1003 07:11:15.192733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" event={"ID":"94e9adea-df96-46c6-86ac-5422f880b631","Type":"ContainerStarted","Data":"82ea83ae92387aa4c09a64f0a1ad9f090f0567a36ca1eacce570c23c963dd65d"} Oct 03 07:11:15 crc kubenswrapper[4810]: I1003 07:11:15.217797 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdwck" podStartSLOduration=2.03374937 podStartE2EDuration="8.217776338s" podCreationTimestamp="2025-10-03 07:11:07 +0000 UTC" firstStartedPulling="2025-10-03 07:11:08.730224741 +0000 UTC m=+902.157475476" lastFinishedPulling="2025-10-03 07:11:14.914251709 +0000 UTC m=+908.341502444" observedRunningTime="2025-10-03 07:11:15.213937113 +0000 UTC m=+908.641187878" watchObservedRunningTime="2025-10-03 07:11:15.217776338 +0000 UTC m=+908.645027093" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.502659 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4lszr"] Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.503782 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.505965 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xf9fs" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.506171 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.506525 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.521818 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4lszr"] Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.577194 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldj9\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-kube-api-access-6ldj9\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.577257 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.678527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldj9\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-kube-api-access-6ldj9\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.678650 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.701661 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-bound-sa-token\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.702276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldj9\" (UniqueName: \"kubernetes.io/projected/eb52d59d-d019-4770-94cf-212c63c37fc6-kube-api-access-6ldj9\") pod \"cert-manager-webhook-d969966f-4lszr\" (UID: \"eb52d59d-d019-4770-94cf-212c63c37fc6\") " pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:18 crc kubenswrapper[4810]: I1003 07:11:18.831708 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:19 crc kubenswrapper[4810]: I1003 07:11:19.309709 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-4lszr"] Oct 03 07:11:19 crc kubenswrapper[4810]: W1003 07:11:19.318079 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb52d59d_d019_4770_94cf_212c63c37fc6.slice/crio-b8cc563fb8426fe7c05540c86a1cec72a6593d452f1c82353b8fbbd7d71f78c3 WatchSource:0}: Error finding container b8cc563fb8426fe7c05540c86a1cec72a6593d452f1c82353b8fbbd7d71f78c3: Status 404 returned error can't find the container with id b8cc563fb8426fe7c05540c86a1cec72a6593d452f1c82353b8fbbd7d71f78c3 Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.147219 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l"] Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.148059 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.150312 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-km4hd" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.159232 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l"] Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.199412 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknr5\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-kube-api-access-vknr5\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.199499 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.223761 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" event={"ID":"eb52d59d-d019-4770-94cf-212c63c37fc6","Type":"ContainerStarted","Data":"b8cc563fb8426fe7c05540c86a1cec72a6593d452f1c82353b8fbbd7d71f78c3"} Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.301405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.301561 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknr5\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-kube-api-access-vknr5\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " 
pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.320804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.323216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknr5\" (UniqueName: \"kubernetes.io/projected/96ab5737-7fd2-43bf-8f7c-b5b10990c2b5-kube-api-access-vknr5\") pod \"cert-manager-cainjector-7d9f95dbf-4w69l\" (UID: \"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.467728 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" Oct 03 07:11:20 crc kubenswrapper[4810]: I1003 07:11:20.939409 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l"] Oct 03 07:11:20 crc kubenswrapper[4810]: W1003 07:11:20.944507 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ab5737_7fd2_43bf_8f7c_b5b10990c2b5.slice/crio-89643cf2e7a3455dd63ae265edd261e4bd34560f9597d31d42ee1e400008bc3a WatchSource:0}: Error finding container 89643cf2e7a3455dd63ae265edd261e4bd34560f9597d31d42ee1e400008bc3a: Status 404 returned error can't find the container with id 89643cf2e7a3455dd63ae265edd261e4bd34560f9597d31d42ee1e400008bc3a Oct 03 07:11:21 crc kubenswrapper[4810]: I1003 07:11:21.233868 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" event={"ID":"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5","Type":"ContainerStarted","Data":"89643cf2e7a3455dd63ae265edd261e4bd34560f9597d31d42ee1e400008bc3a"} Oct 03 07:11:24 crc kubenswrapper[4810]: I1003 07:11:24.250647 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" event={"ID":"eb52d59d-d019-4770-94cf-212c63c37fc6","Type":"ContainerStarted","Data":"0f87d96e852aa26226691f770a38f84f88eeaef975a659a7ac73ae35807d6170"} Oct 03 07:11:24 crc kubenswrapper[4810]: I1003 07:11:24.251056 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:24 crc kubenswrapper[4810]: I1003 07:11:24.252269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" event={"ID":"96ab5737-7fd2-43bf-8f7c-b5b10990c2b5","Type":"ContainerStarted","Data":"fe4aae2fbc1586d0b441481e548985cca0e2a42905c853670c90475219cf658d"} Oct 03 07:11:24 crc kubenswrapper[4810]: I1003 07:11:24.266857 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" podStartSLOduration=2.097310158 podStartE2EDuration="6.266834868s" podCreationTimestamp="2025-10-03 07:11:18 +0000 UTC" firstStartedPulling="2025-10-03 07:11:19.32034065 +0000 UTC m=+912.747591385" lastFinishedPulling="2025-10-03 07:11:23.48986536 +0000 UTC m=+916.917116095" observedRunningTime="2025-10-03 07:11:24.265031119 +0000 UTC m=+917.692281854" watchObservedRunningTime="2025-10-03 
07:11:24.266834868 +0000 UTC m=+917.694085603" Oct 03 07:11:24 crc kubenswrapper[4810]: I1003 07:11:24.282990 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-4w69l" podStartSLOduration=1.7422740559999998 podStartE2EDuration="4.282937405s" podCreationTimestamp="2025-10-03 07:11:20 +0000 UTC" firstStartedPulling="2025-10-03 07:11:20.946403065 +0000 UTC m=+914.373653800" lastFinishedPulling="2025-10-03 07:11:23.487066404 +0000 UTC m=+916.914317149" observedRunningTime="2025-10-03 07:11:24.280455688 +0000 UTC m=+917.707706423" watchObservedRunningTime="2025-10-03 07:11:24.282937405 +0000 UTC m=+917.710188150" Oct 03 07:11:28 crc kubenswrapper[4810]: I1003 07:11:28.834191 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-4lszr" Oct 03 07:11:32 crc kubenswrapper[4810]: I1003 07:11:32.088674 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:11:32 crc kubenswrapper[4810]: I1003 07:11:32.089059 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:11:32 crc kubenswrapper[4810]: I1003 07:11:32.089101 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:11:32 crc kubenswrapper[4810]: I1003 07:11:32.089656 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:11:32 crc kubenswrapper[4810]: I1003 07:11:32.089707 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5" gracePeriod=600 Oct 03 07:11:33 crc kubenswrapper[4810]: I1003 07:11:33.320357 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5" exitCode=0 Oct 03 07:11:33 crc kubenswrapper[4810]: I1003 07:11:33.320421 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5"} Oct 03 07:11:33 crc kubenswrapper[4810]: I1003 07:11:33.321080 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a"} Oct 03 07:11:33 crc kubenswrapper[4810]: I1003 07:11:33.321115 4810 scope.go:117] "RemoveContainer" containerID="796937be4603feeebd1fcf66d6995fd9e028f8dea1fd6b2bd8820d728904d9e3" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.606874 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-l99g8"] Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.609286 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.614118 4810 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-84hjp" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.628545 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-l99g8"] Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.635512 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjwm\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-kube-api-access-2hjwm\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.635587 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.736259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.736370 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hjwm\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-kube-api-access-2hjwm\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.762115 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.767585 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjwm\" (UniqueName: \"kubernetes.io/projected/44861968-3c3b-4288-b1d2-0da6e2d74cc4-kube-api-access-2hjwm\") pod \"cert-manager-7d4cc89fcb-l99g8\" (UID: \"44861968-3c3b-4288-b1d2-0da6e2d74cc4\") " pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:37 crc kubenswrapper[4810]: I1003 07:11:37.947066 4810 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" Oct 03 07:11:38 crc kubenswrapper[4810]: I1003 07:11:38.151167 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-l99g8"] Oct 03 07:11:38 crc kubenswrapper[4810]: I1003 07:11:38.358046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" event={"ID":"44861968-3c3b-4288-b1d2-0da6e2d74cc4","Type":"ContainerStarted","Data":"4be0a392d192d8d963a6eda9d6135222d798f2f2681f78a3e6c7a6b46a53861b"} Oct 03 07:11:38 crc kubenswrapper[4810]: I1003 07:11:38.358600 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" event={"ID":"44861968-3c3b-4288-b1d2-0da6e2d74cc4","Type":"ContainerStarted","Data":"fb7bf276c25217c3f46f3faf0b234f635954b6dca5a77cfa07dfe282a4291390"} Oct 03 07:11:38 crc kubenswrapper[4810]: I1003 07:11:38.384960 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-l99g8" podStartSLOduration=1.384943673 podStartE2EDuration="1.384943673s" podCreationTimestamp="2025-10-03 07:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:11:38.380556744 +0000 UTC m=+931.807807479" watchObservedRunningTime="2025-10-03 07:11:38.384943673 +0000 UTC m=+931.812194408" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.168057 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.171875 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.181742 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ntjzw" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.182225 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.183701 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.184003 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.221790 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbxp\" (UniqueName: \"kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp\") pod \"openstack-operator-index-fdnnb\" (UID: \"df4aab23-4d62-4625-9cb7-e5c5aba1d953\") " pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.325152 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbxp\" (UniqueName: \"kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp\") pod \"openstack-operator-index-fdnnb\" (UID: \"df4aab23-4d62-4625-9cb7-e5c5aba1d953\") " pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.345099 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xdbxp\" (UniqueName: \"kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp\") pod \"openstack-operator-index-fdnnb\" (UID: \"df4aab23-4d62-4625-9cb7-e5c5aba1d953\") " pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.494383 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:43 crc kubenswrapper[4810]: I1003 07:11:43.719129 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:43 crc kubenswrapper[4810]: W1003 07:11:43.727011 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf4aab23_4d62_4625_9cb7_e5c5aba1d953.slice/crio-bdb53aa960fea4998c0576248a99979ac6304a772b0dc01b72475be677379561 WatchSource:0}: Error finding container bdb53aa960fea4998c0576248a99979ac6304a772b0dc01b72475be677379561: Status 404 returned error can't find the container with id bdb53aa960fea4998c0576248a99979ac6304a772b0dc01b72475be677379561 Oct 03 07:11:44 crc kubenswrapper[4810]: I1003 07:11:44.408252 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fdnnb" event={"ID":"df4aab23-4d62-4625-9cb7-e5c5aba1d953","Type":"ContainerStarted","Data":"bdb53aa960fea4998c0576248a99979ac6304a772b0dc01b72475be677379561"} Oct 03 07:11:46 crc kubenswrapper[4810]: I1003 07:11:46.519150 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.149729 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kkbw6"] Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.151105 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.155813 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kkbw6"] Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.279736 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n6dr\" (UniqueName: \"kubernetes.io/projected/bd51530d-735f-4124-bb3e-4c3967f64a31-kube-api-access-8n6dr\") pod \"openstack-operator-index-kkbw6\" (UID: \"bd51530d-735f-4124-bb3e-4c3967f64a31\") " pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.382437 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n6dr\" (UniqueName: \"kubernetes.io/projected/bd51530d-735f-4124-bb3e-4c3967f64a31-kube-api-access-8n6dr\") pod \"openstack-operator-index-kkbw6\" (UID: \"bd51530d-735f-4124-bb3e-4c3967f64a31\") " pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.414788 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n6dr\" (UniqueName: \"kubernetes.io/projected/bd51530d-735f-4124-bb3e-4c3967f64a31-kube-api-access-8n6dr\") pod \"openstack-operator-index-kkbw6\" (UID: \"bd51530d-735f-4124-bb3e-4c3967f64a31\") " pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.466762 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:47 crc kubenswrapper[4810]: I1003 07:11:47.878572 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kkbw6"] Oct 03 07:11:47 crc kubenswrapper[4810]: W1003 07:11:47.884065 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd51530d_735f_4124_bb3e_4c3967f64a31.slice/crio-82ce14ce23ad9db287913501c0233bec8937dd660b911caa645183655d97be14 WatchSource:0}: Error finding container 82ce14ce23ad9db287913501c0233bec8937dd660b911caa645183655d97be14: Status 404 returned error can't find the container with id 82ce14ce23ad9db287913501c0233bec8937dd660b911caa645183655d97be14 Oct 03 07:11:48 crc kubenswrapper[4810]: I1003 07:11:48.445411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kkbw6" event={"ID":"bd51530d-735f-4124-bb3e-4c3967f64a31","Type":"ContainerStarted","Data":"82ce14ce23ad9db287913501c0233bec8937dd660b911caa645183655d97be14"} Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.497549 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fdnnb" event={"ID":"df4aab23-4d62-4625-9cb7-e5c5aba1d953","Type":"ContainerStarted","Data":"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677"} Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.497735 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-fdnnb" podUID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" containerName="registry-server" containerID="cri-o://7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677" gracePeriod=2 Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.499886 4810 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kkbw6" event={"ID":"bd51530d-735f-4124-bb3e-4c3967f64a31","Type":"ContainerStarted","Data":"62a8ce962fda6e5f1d052340807b567fa977ab275bac4aa37310a2c4e601afd2"} Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.539874 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kkbw6" podStartSLOduration=1.8706855789999999 podStartE2EDuration="7.539813099s" podCreationTimestamp="2025-10-03 07:11:47 +0000 UTC" firstStartedPulling="2025-10-03 07:11:47.886239909 +0000 UTC m=+941.313490654" lastFinishedPulling="2025-10-03 07:11:53.555367429 +0000 UTC m=+946.982618174" observedRunningTime="2025-10-03 07:11:54.538178575 +0000 UTC m=+947.965429310" watchObservedRunningTime="2025-10-03 07:11:54.539813099 +0000 UTC m=+947.967063844" Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.541661 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fdnnb" podStartSLOduration=1.7187107240000001 podStartE2EDuration="11.541650408s" podCreationTimestamp="2025-10-03 07:11:43 +0000 UTC" firstStartedPulling="2025-10-03 07:11:43.729936617 +0000 UTC m=+937.157187382" lastFinishedPulling="2025-10-03 07:11:53.552876321 +0000 UTC m=+946.980127066" observedRunningTime="2025-10-03 07:11:54.520162465 +0000 UTC m=+947.947413220" watchObservedRunningTime="2025-10-03 07:11:54.541650408 +0000 UTC m=+947.968901173" Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.865977 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.993878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbxp\" (UniqueName: \"kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp\") pod \"df4aab23-4d62-4625-9cb7-e5c5aba1d953\" (UID: \"df4aab23-4d62-4625-9cb7-e5c5aba1d953\") " Oct 03 07:11:54 crc kubenswrapper[4810]: I1003 07:11:54.999649 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp" (OuterVolumeSpecName: "kube-api-access-xdbxp") pod "df4aab23-4d62-4625-9cb7-e5c5aba1d953" (UID: "df4aab23-4d62-4625-9cb7-e5c5aba1d953"). InnerVolumeSpecName "kube-api-access-xdbxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.095566 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbxp\" (UniqueName: \"kubernetes.io/projected/df4aab23-4d62-4625-9cb7-e5c5aba1d953-kube-api-access-xdbxp\") on node \"crc\" DevicePath \"\"" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.511996 4810 generic.go:334] "Generic (PLEG): container finished" podID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" containerID="7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677" exitCode=0 Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.512155 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-fdnnb" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.512185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fdnnb" event={"ID":"df4aab23-4d62-4625-9cb7-e5c5aba1d953","Type":"ContainerDied","Data":"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677"} Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.512263 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fdnnb" event={"ID":"df4aab23-4d62-4625-9cb7-e5c5aba1d953","Type":"ContainerDied","Data":"bdb53aa960fea4998c0576248a99979ac6304a772b0dc01b72475be677379561"} Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.512306 4810 scope.go:117] "RemoveContainer" containerID="7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.538633 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.541481 4810 scope.go:117] "RemoveContainer" containerID="7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677" Oct 03 07:11:55 crc kubenswrapper[4810]: E1003 07:11:55.542039 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677\": container with ID starting with 7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677 not found: ID does not exist" containerID="7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.542083 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677"} err="failed to get container status \"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677\": rpc error: code = NotFound desc = could not find container \"7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677\": container with ID starting with 7e2bc14cd905b2b4e366fe3fd44e3bee0b4d3b1415859878357374998e268677 not found: ID does not exist" Oct 03 07:11:55 crc kubenswrapper[4810]: I1003 07:11:55.543184 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-fdnnb"] Oct 03 07:11:57 crc kubenswrapper[4810]: I1003 07:11:57.320416 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" path="/var/lib/kubelet/pods/df4aab23-4d62-4625-9cb7-e5c5aba1d953/volumes" Oct 03 07:11:57 crc kubenswrapper[4810]: I1003 07:11:57.467825 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:57 crc kubenswrapper[4810]: I1003 07:11:57.467948 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:57 crc kubenswrapper[4810]: I1003 07:11:57.492879 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:11:58 crc kubenswrapper[4810]: I1003 07:11:58.573423 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kkbw6" Oct 03 07:12:04 crc 
kubenswrapper[4810]: I1003 07:12:04.672229 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt"] Oct 03 07:12:04 crc kubenswrapper[4810]: E1003 07:12:04.672696 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" containerName="registry-server" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.672708 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" containerName="registry-server" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.672832 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4aab23-4d62-4625-9cb7-e5c5aba1d953" containerName="registry-server" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.673617 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.675818 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tsb5n" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.685123 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt"] Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.730297 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2t69\" (UniqueName: \"kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.730351 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.730386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.831270 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2t69\" (UniqueName: \"kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.831322 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.831352 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.831964 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.832519 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.852606 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2t69\" (UniqueName: \"kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69\") pod \"00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:04 crc kubenswrapper[4810]: I1003 07:12:04.993427 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:05 crc kubenswrapper[4810]: I1003 07:12:05.470880 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt"] Oct 03 07:12:05 crc kubenswrapper[4810]: I1003 07:12:05.590800 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerStarted","Data":"07aa61d9e32be353e9a36bbcf254da2dc05cfe2cd9522cc8650a90ecdf0e5317"} Oct 03 07:12:06 crc kubenswrapper[4810]: I1003 07:12:06.613714 4810 generic.go:334] "Generic (PLEG): container finished" podID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerID="7168d5120feb68fdbb7678b05a41cc334f7c75e3999dfead70bf63bcd562b939" exitCode=0 Oct 03 07:12:06 crc kubenswrapper[4810]: I1003 07:12:06.613842 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerDied","Data":"7168d5120feb68fdbb7678b05a41cc334f7c75e3999dfead70bf63bcd562b939"} Oct 03 07:12:09 crc kubenswrapper[4810]: I1003 07:12:09.637381 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerStarted","Data":"451785f202f3972044922807b360a099eea2fef706ed7ac6303b48db9bfb087a"} Oct 03 07:12:10 crc kubenswrapper[4810]: I1003 07:12:10.645830 4810 generic.go:334] "Generic (PLEG): container finished" podID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerID="451785f202f3972044922807b360a099eea2fef706ed7ac6303b48db9bfb087a" exitCode=0 Oct 03 07:12:10 crc kubenswrapper[4810]: I1003 07:12:10.645987 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerDied","Data":"451785f202f3972044922807b360a099eea2fef706ed7ac6303b48db9bfb087a"} Oct 03 07:12:11 crc kubenswrapper[4810]: I1003 07:12:11.655877 4810 generic.go:334] "Generic (PLEG): container finished" podID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerID="20df7ddb0d055c67d24cb0e955251a1b23965375f211022fb8ddbc5c3f88e518" exitCode=0 Oct 03 07:12:11 crc kubenswrapper[4810]: I1003 07:12:11.655967 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerDied","Data":"20df7ddb0d055c67d24cb0e955251a1b23965375f211022fb8ddbc5c3f88e518"} Oct 03 07:12:12 crc kubenswrapper[4810]: I1003 07:12:12.944263 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.055847 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util\") pod \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.055961 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle\") pod \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.056026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2t69\" (UniqueName: \"kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69\") pod \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\" (UID: \"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c\") " Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.057050 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle" (OuterVolumeSpecName: "bundle") pod "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" (UID: "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.062267 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69" (OuterVolumeSpecName: "kube-api-access-j2t69") pod "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" (UID: "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c"). InnerVolumeSpecName "kube-api-access-j2t69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.067080 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util" (OuterVolumeSpecName: "util") pod "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" (UID: "b9f7779b-bb8c-4383-a9fb-3bfe5c38784c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.157938 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.158009 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2t69\" (UniqueName: \"kubernetes.io/projected/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-kube-api-access-j2t69\") on node \"crc\" DevicePath \"\"" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.158040 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b9f7779b-bb8c-4383-a9fb-3bfe5c38784c-util\") on node \"crc\" DevicePath \"\"" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.676609 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" event={"ID":"b9f7779b-bb8c-4383-a9fb-3bfe5c38784c","Type":"ContainerDied","Data":"07aa61d9e32be353e9a36bbcf254da2dc05cfe2cd9522cc8650a90ecdf0e5317"} Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.676647 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07aa61d9e32be353e9a36bbcf254da2dc05cfe2cd9522cc8650a90ecdf0e5317" Oct 03 07:12:13 crc kubenswrapper[4810]: I1003 07:12:13.676690 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.555093 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5"] Oct 03 07:12:18 crc kubenswrapper[4810]: E1003 07:12:18.555723 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="pull" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.555737 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="pull" Oct 03 07:12:18 crc kubenswrapper[4810]: E1003 07:12:18.555756 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="util" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.555763 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="util" Oct 03 07:12:18 crc kubenswrapper[4810]: E1003 07:12:18.555774 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="extract" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.555780 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="extract" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.555884 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f7779b-bb8c-4383-a9fb-3bfe5c38784c" containerName="extract" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.556487 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.561430 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-tlk4n" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.585553 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5"] Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.735064 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vq5c\" (UniqueName: \"kubernetes.io/projected/6526f954-e830-42ea-b0a1-6121ff0e32c1-kube-api-access-8vq5c\") pod \"openstack-operator-controller-operator-86f8d7b75f-wl9x5\" (UID: \"6526f954-e830-42ea-b0a1-6121ff0e32c1\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.836135 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vq5c\" (UniqueName: \"kubernetes.io/projected/6526f954-e830-42ea-b0a1-6121ff0e32c1-kube-api-access-8vq5c\") pod \"openstack-operator-controller-operator-86f8d7b75f-wl9x5\" (UID: \"6526f954-e830-42ea-b0a1-6121ff0e32c1\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.857516 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vq5c\" (UniqueName: \"kubernetes.io/projected/6526f954-e830-42ea-b0a1-6121ff0e32c1-kube-api-access-8vq5c\") pod \"openstack-operator-controller-operator-86f8d7b75f-wl9x5\" (UID: \"6526f954-e830-42ea-b0a1-6121ff0e32c1\") " pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:18 crc kubenswrapper[4810]: I1003 07:12:18.876803 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:19 crc kubenswrapper[4810]: I1003 07:12:19.336379 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5"] Oct 03 07:12:19 crc kubenswrapper[4810]: I1003 07:12:19.718440 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" event={"ID":"6526f954-e830-42ea-b0a1-6121ff0e32c1","Type":"ContainerStarted","Data":"cf6080d3308fcf5b25182302461e518af0d5329400c2229a59930f710009ea00"} Oct 03 07:12:23 crc kubenswrapper[4810]: I1003 07:12:23.751468 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" event={"ID":"6526f954-e830-42ea-b0a1-6121ff0e32c1","Type":"ContainerStarted","Data":"e8bc4e947e4bb80e527534926128452251a68ba20f441e38e9507741e89a58e1"} Oct 03 07:12:29 crc kubenswrapper[4810]: I1003 07:12:29.813449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" event={"ID":"6526f954-e830-42ea-b0a1-6121ff0e32c1","Type":"ContainerStarted","Data":"99df977339f261db7546f87cbe948ec30af2ede882ddba490ab8e01b82ba46ca"} Oct 03 07:12:29 crc kubenswrapper[4810]: I1003 07:12:29.814108 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:29 crc kubenswrapper[4810]: I1003 07:12:29.817708 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" Oct 03 07:12:29 crc kubenswrapper[4810]: I1003 07:12:29.848838 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-86f8d7b75f-wl9x5" podStartSLOduration=2.625186019 podStartE2EDuration="11.848820428s" podCreationTimestamp="2025-10-03 07:12:18 +0000 UTC" firstStartedPulling="2025-10-03 07:12:19.342945249 +0000 UTC m=+972.770195984" lastFinishedPulling="2025-10-03 07:12:28.566579658 +0000 UTC m=+981.993830393" observedRunningTime="2025-10-03 07:12:29.847199934 +0000 UTC m=+983.274450709" watchObservedRunningTime="2025-10-03 07:12:29.848820428 +0000 UTC m=+983.276071173" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.864079 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.865849 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.867606 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-v7s7h" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.869025 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.870189 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.873281 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-s4r55" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.878876 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.896739 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.905254 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.906254 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.908731 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xbt5d" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.921556 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.951337 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.952502 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.954869 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7kzdq" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.969991 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl"] Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.971373 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.974704 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lr5j5" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.975078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4957d\" (UniqueName: \"kubernetes.io/projected/4686e0fc-c208-4a38-a7ae-33adc6123d0d-kube-api-access-4957d\") pod \"barbican-operator-controller-manager-6d6d64fdcf-5d7tm\" (UID: \"4686e0fc-c208-4a38-a7ae-33adc6123d0d\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.975123 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckfhv\" (UniqueName: \"kubernetes.io/projected/701c6350-6581-453d-abc4-728bff24f1a5-kube-api-access-ckfhv\") pod \"cinder-operator-controller-manager-8686fd99f7-jlqcq\" (UID: \"701c6350-6581-453d-abc4-728bff24f1a5\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:03 crc kubenswrapper[4810]: I1003 07:13:03.975148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5wlf\" (UniqueName: \"kubernetes.io/projected/911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d-kube-api-access-f5wlf\") pod \"designate-operator-controller-manager-58d86cd59d-nthwx\" (UID: \"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.001228 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.014110 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.015745 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.022382 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.022567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xncpt" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.027699 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.029749 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.032718 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zvndp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.032923 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.037068 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.044007 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.066978 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.068228 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.071186 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lgflr" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxrh\" (UniqueName: \"kubernetes.io/projected/84c853db-41b6-4013-b6a6-b9f6c3fa74e3-kube-api-access-8kxrh\") pod \"horizon-operator-controller-manager-586b66cf4f-qxv5t\" (UID: \"84c853db-41b6-4013-b6a6-b9f6c3fa74e3\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082069 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wslml\" (UniqueName: \"kubernetes.io/projected/e910dcfe-cedc-4a55-ad55-39adb2422c48-kube-api-access-wslml\") pod \"glance-operator-controller-manager-d785ddfd5-kln57\" (UID: \"e910dcfe-cedc-4a55-ad55-39adb2422c48\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4957d\" (UniqueName: \"kubernetes.io/projected/4686e0fc-c208-4a38-a7ae-33adc6123d0d-kube-api-access-4957d\") pod \"barbican-operator-controller-manager-6d6d64fdcf-5d7tm\" (UID: \"4686e0fc-c208-4a38-a7ae-33adc6123d0d\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082118 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckfhv\" (UniqueName: \"kubernetes.io/projected/701c6350-6581-453d-abc4-728bff24f1a5-kube-api-access-ckfhv\") pod \"cinder-operator-controller-manager-8686fd99f7-jlqcq\" (UID: \"701c6350-6581-453d-abc4-728bff24f1a5\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5wlf\" 
(UniqueName: \"kubernetes.io/projected/911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d-kube-api-access-f5wlf\") pod \"designate-operator-controller-manager-58d86cd59d-nthwx\" (UID: \"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.082434 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btkw\" (UniqueName: \"kubernetes.io/projected/d812ca0a-067a-4c91-a460-d340ef72d051-kube-api-access-9btkw\") pod \"heat-operator-controller-manager-5ffbdb7ddf-nf2kl\" (UID: \"d812ca0a-067a-4c91-a460-d340ef72d051\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.086809 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.088040 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.097155 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xhwdx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.115728 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.130988 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.134770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckfhv\" (UniqueName: \"kubernetes.io/projected/701c6350-6581-453d-abc4-728bff24f1a5-kube-api-access-ckfhv\") pod \"cinder-operator-controller-manager-8686fd99f7-jlqcq\" (UID: \"701c6350-6581-453d-abc4-728bff24f1a5\") " pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.134842 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.135837 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.139027 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-kngq7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.160201 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4957d\" (UniqueName: \"kubernetes.io/projected/4686e0fc-c208-4a38-a7ae-33adc6123d0d-kube-api-access-4957d\") pod \"barbican-operator-controller-manager-6d6d64fdcf-5d7tm\" (UID: \"4686e0fc-c208-4a38-a7ae-33adc6123d0d\") " pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.160601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5wlf\" (UniqueName: \"kubernetes.io/projected/911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d-kube-api-access-f5wlf\") pod \"designate-operator-controller-manager-58d86cd59d-nthwx\" (UID: \"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d\") " pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.163597 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.164940 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.179127 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183577 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkhdw\" (UniqueName: \"kubernetes.io/projected/21a68e29-471e-428d-9491-ae4b33a01e8a-kube-api-access-dkhdw\") pod \"keystone-operator-controller-manager-6c9969c6c6-zvpws\" (UID: \"21a68e29-471e-428d-9491-ae4b33a01e8a\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183635 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btkw\" (UniqueName: \"kubernetes.io/projected/d812ca0a-067a-4c91-a460-d340ef72d051-kube-api-access-9btkw\") pod \"heat-operator-controller-manager-5ffbdb7ddf-nf2kl\" (UID: \"d812ca0a-067a-4c91-a460-d340ef72d051\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183699 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9v7l\" (UniqueName: \"kubernetes.io/projected/7ed3bd35-9f00-40d3-9ed0-a111c4131b11-kube-api-access-l9v7l\") pod \"ironic-operator-controller-manager-59b5fc9845-hv9vj\" (UID: \"7ed3bd35-9f00-40d3-9ed0-a111c4131b11\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183748 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert\") pod 
\"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183773 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jtf\" (UniqueName: \"kubernetes.io/projected/42afdadf-08ba-4196-a815-a4fc0acf2181-kube-api-access-t2jtf\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183790 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxrh\" (UniqueName: \"kubernetes.io/projected/84c853db-41b6-4013-b6a6-b9f6c3fa74e3-kube-api-access-8kxrh\") pod \"horizon-operator-controller-manager-586b66cf4f-qxv5t\" (UID: \"84c853db-41b6-4013-b6a6-b9f6c3fa74e3\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.183829 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wslml\" (UniqueName: \"kubernetes.io/projected/e910dcfe-cedc-4a55-ad55-39adb2422c48-kube-api-access-wslml\") pod \"glance-operator-controller-manager-d785ddfd5-kln57\" (UID: \"e910dcfe-cedc-4a55-ad55-39adb2422c48\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.208150 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-h5m49" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.211121 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.212099 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.225502 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.228625 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxrh\" (UniqueName: \"kubernetes.io/projected/84c853db-41b6-4013-b6a6-b9f6c3fa74e3-kube-api-access-8kxrh\") pod \"horizon-operator-controller-manager-586b66cf4f-qxv5t\" (UID: \"84c853db-41b6-4013-b6a6-b9f6c3fa74e3\") " pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.233374 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.317155 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.325973 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6r7\" (UniqueName: \"kubernetes.io/projected/6c644f74-4957-40ad-8286-b172b802a323-kube-api-access-rf6r7\") pod \"manila-operator-controller-manager-66fdd975d9-47p2c\" (UID: \"6c644f74-4957-40ad-8286-b172b802a323\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326237 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkhdw\" (UniqueName: \"kubernetes.io/projected/21a68e29-471e-428d-9491-ae4b33a01e8a-kube-api-access-dkhdw\") pod \"keystone-operator-controller-manager-6c9969c6c6-zvpws\" (UID: \"21a68e29-471e-428d-9491-ae4b33a01e8a\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326360 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9v7l\" (UniqueName: \"kubernetes.io/projected/7ed3bd35-9f00-40d3-9ed0-a111c4131b11-kube-api-access-l9v7l\") pod \"ironic-operator-controller-manager-59b5fc9845-hv9vj\" (UID: \"7ed3bd35-9f00-40d3-9ed0-a111c4131b11\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326832 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pss\" (UniqueName: \"kubernetes.io/projected/9aeddbe0-0fe7-451c-afaf-bd0074505142-kube-api-access-52pss\") pod \"mariadb-operator-controller-manager-696ff4bcdd-msm2p\" (UID: \"9aeddbe0-0fe7-451c-afaf-bd0074505142\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jtf\" (UniqueName: \"kubernetes.io/projected/42afdadf-08ba-4196-a815-a4fc0acf2181-kube-api-access-t2jtf\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.326151 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wslml\" (UniqueName: \"kubernetes.io/projected/e910dcfe-cedc-4a55-ad55-39adb2422c48-kube-api-access-wslml\") pod \"glance-operator-controller-manager-d785ddfd5-kln57\" (UID: \"e910dcfe-cedc-4a55-ad55-39adb2422c48\") " pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.330109 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 07:13:04 crc 
kubenswrapper[4810]: E1003 07:13:04.330278 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert podName:42afdadf-08ba-4196-a815-a4fc0acf2181 nodeName:}" failed. No retries permitted until 2025-10-03 07:13:04.83026035 +0000 UTC m=+1018.257511085 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert") pod "infra-operator-controller-manager-7c9978f67-wqn2b" (UID: "42afdadf-08ba-4196-a815-a4fc0acf2181") : secret "infra-operator-webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.335243 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-4tqc2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.336207 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.356965 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jtf\" (UniqueName: \"kubernetes.io/projected/42afdadf-08ba-4196-a815-a4fc0acf2181-kube-api-access-t2jtf\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.361036 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.401129 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.364705 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkhdw\" (UniqueName: \"kubernetes.io/projected/21a68e29-471e-428d-9491-ae4b33a01e8a-kube-api-access-dkhdw\") pod \"keystone-operator-controller-manager-6c9969c6c6-zvpws\" (UID: \"21a68e29-471e-428d-9491-ae4b33a01e8a\") " pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.404050 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.391745 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btkw\" (UniqueName: \"kubernetes.io/projected/d812ca0a-067a-4c91-a460-d340ef72d051-kube-api-access-9btkw\") pod \"heat-operator-controller-manager-5ffbdb7ddf-nf2kl\" (UID: \"d812ca0a-067a-4c91-a460-d340ef72d051\") " pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.406163 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.406390 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.409567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bqfr6" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.432393 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.434974 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.436104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6r7\" (UniqueName: \"kubernetes.io/projected/6c644f74-4957-40ad-8286-b172b802a323-kube-api-access-rf6r7\") pod \"manila-operator-controller-manager-66fdd975d9-47p2c\" (UID: \"6c644f74-4957-40ad-8286-b172b802a323\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.436184 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbb85\" (UniqueName: \"kubernetes.io/projected/b1578049-8763-4c26-b149-8497d94da92e-kube-api-access-kbb85\") pod \"octavia-operator-controller-manager-b4444585c-9qkqx\" (UID: \"b1578049-8763-4c26-b149-8497d94da92e\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.436257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pss\" (UniqueName: \"kubernetes.io/projected/9aeddbe0-0fe7-451c-afaf-bd0074505142-kube-api-access-52pss\") pod \"mariadb-operator-controller-manager-696ff4bcdd-msm2p\" (UID: \"9aeddbe0-0fe7-451c-afaf-bd0074505142\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.436294 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pdt\" (UniqueName: \"kubernetes.io/projected/5fa60d72-b0e4-4a27-b2af-d228cd9db5da-kube-api-access-b7pdt\") pod \"nova-operator-controller-manager-5b45478b88-cj2zz\" (UID: \"5fa60d72-b0e4-4a27-b2af-d228cd9db5da\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.436320 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjr69\" (UniqueName: \"kubernetes.io/projected/6ef1249c-27f8-4cc0-8134-690b6b8773d1-kube-api-access-pjr69\") pod \"neutron-operator-controller-manager-549fb68678-6qbx2\" (UID: \"6ef1249c-27f8-4cc0-8134-690b6b8773d1\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.438104 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-5k74v" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.438276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9v7l\" (UniqueName: \"kubernetes.io/projected/7ed3bd35-9f00-40d3-9ed0-a111c4131b11-kube-api-access-l9v7l\") pod 
\"ironic-operator-controller-manager-59b5fc9845-hv9vj\" (UID: \"7ed3bd35-9f00-40d3-9ed0-a111c4131b11\") " pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.438667 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.443364 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.467610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pss\" (UniqueName: \"kubernetes.io/projected/9aeddbe0-0fe7-451c-afaf-bd0074505142-kube-api-access-52pss\") pod \"mariadb-operator-controller-manager-696ff4bcdd-msm2p\" (UID: \"9aeddbe0-0fe7-451c-afaf-bd0074505142\") " pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.495632 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.498840 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.502034 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-zv72j" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.502219 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.505118 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.507563 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.514312 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6r7\" (UniqueName: \"kubernetes.io/projected/6c644f74-4957-40ad-8286-b172b802a323-kube-api-access-rf6r7\") pod \"manila-operator-controller-manager-66fdd975d9-47p2c\" (UID: \"6c644f74-4957-40ad-8286-b172b802a323\") " pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.517131 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.518371 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.519429 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-n7qhk" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.531052 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.533105 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.533237 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.537000 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.537678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pdt\" (UniqueName: \"kubernetes.io/projected/5fa60d72-b0e4-4a27-b2af-d228cd9db5da-kube-api-access-b7pdt\") pod \"nova-operator-controller-manager-5b45478b88-cj2zz\" (UID: \"5fa60d72-b0e4-4a27-b2af-d228cd9db5da\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.537731 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjr69\" (UniqueName: \"kubernetes.io/projected/6ef1249c-27f8-4cc0-8134-690b6b8773d1-kube-api-access-pjr69\") pod \"neutron-operator-controller-manager-549fb68678-6qbx2\" (UID: \"6ef1249c-27f8-4cc0-8134-690b6b8773d1\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.539098 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbb85\" (UniqueName: \"kubernetes.io/projected/b1578049-8763-4c26-b149-8497d94da92e-kube-api-access-kbb85\") pod \"octavia-operator-controller-manager-b4444585c-9qkqx\" (UID: \"b1578049-8763-4c26-b149-8497d94da92e\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.541460 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-fhqlm" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.542342 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vs8jm" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.548572 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.548614 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.549943 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.552755 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jss5q" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.559808 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.572167 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.582107 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.583210 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.583292 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.589097 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.592980 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sqtbd" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.603138 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.625412 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.630574 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjr69\" (UniqueName: \"kubernetes.io/projected/6ef1249c-27f8-4cc0-8134-690b6b8773d1-kube-api-access-pjr69\") pod \"neutron-operator-controller-manager-549fb68678-6qbx2\" (UID: \"6ef1249c-27f8-4cc0-8134-690b6b8773d1\") " pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.630580 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pdt\" (UniqueName: \"kubernetes.io/projected/5fa60d72-b0e4-4a27-b2af-d228cd9db5da-kube-api-access-b7pdt\") pod \"nova-operator-controller-manager-5b45478b88-cj2zz\" (UID: \"5fa60d72-b0e4-4a27-b2af-d228cd9db5da\") " pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.630836 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbb85\" (UniqueName: \"kubernetes.io/projected/b1578049-8763-4c26-b149-8497d94da92e-kube-api-access-kbb85\") pod \"octavia-operator-controller-manager-b4444585c-9qkqx\" (UID: \"b1578049-8763-4c26-b149-8497d94da92e\") " pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.642437 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9cw6\" (UniqueName: \"kubernetes.io/projected/368ec6b5-ba12-467e-ab4c-d46a83c31483-kube-api-access-g9cw6\") pod \"telemetry-operator-controller-manager-5ffb97cddf-vf8h7\" (UID: \"368ec6b5-ba12-467e-ab4c-d46a83c31483\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.642516 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.642550 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42ts\" (UniqueName: \"kubernetes.io/projected/aaf10555-293e-4c2b-baea-ffcdd4eeb046-kube-api-access-f42ts\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.642596 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9nw6\" (UniqueName: \"kubernetes.io/projected/2514440b-b998-4078-834e-642c3bcae80f-kube-api-access-r9nw6\") pod \"ovn-operator-controller-manager-855d7949fc-gnvp7\" (UID: \"2514440b-b998-4078-834e-642c3bcae80f\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.642632 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qqccz\" (UniqueName: \"kubernetes.io/projected/b842fd5e-6125-40cf-b729-947e54309c87-kube-api-access-qqccz\") pod \"placement-operator-controller-manager-ccbfcb8c-tnrxp\" (UID: \"b842fd5e-6125-40cf-b729-947e54309c87\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.650860 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.651974 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.670910 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.678660 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-bpf5z" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.698218 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.722586 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.737952 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.739589 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746211 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9nw6\" (UniqueName: \"kubernetes.io/projected/2514440b-b998-4078-834e-642c3bcae80f-kube-api-access-r9nw6\") pod \"ovn-operator-controller-manager-855d7949fc-gnvp7\" (UID: \"2514440b-b998-4078-834e-642c3bcae80f\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746270 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqccz\" (UniqueName: \"kubernetes.io/projected/b842fd5e-6125-40cf-b729-947e54309c87-kube-api-access-qqccz\") pod \"placement-operator-controller-manager-ccbfcb8c-tnrxp\" (UID: \"b842fd5e-6125-40cf-b729-947e54309c87\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746354 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgbx\" (UniqueName: \"kubernetes.io/projected/9ab8a530-fefb-477b-85f7-6f716f684292-kube-api-access-gwgbx\") pod \"watcher-operator-controller-manager-5595cf6c95-4hcbj\" (UID: \"9ab8a530-fefb-477b-85f7-6f716f684292\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746404 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brbwh\" (UniqueName: \"kubernetes.io/projected/1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63-kube-api-access-brbwh\") pod \"swift-operator-controller-manager-76d5577b-7dv7x\" (UID: \"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746423 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n4js\" (UniqueName: \"kubernetes.io/projected/d6c2b423-4398-416f-bb17-70a8eb814964-kube-api-access-6n4js\") pod \"test-operator-controller-manager-6bb6dcddc-hmtgk\" (UID: \"d6c2b423-4398-416f-bb17-70a8eb814964\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9cw6\" (UniqueName: \"kubernetes.io/projected/368ec6b5-ba12-467e-ab4c-d46a83c31483-kube-api-access-g9cw6\") pod \"telemetry-operator-controller-manager-5ffb97cddf-vf8h7\" (UID: \"368ec6b5-ba12-467e-ab4c-d46a83c31483\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.746502 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42ts\" (UniqueName: 
\"kubernetes.io/projected/aaf10555-293e-4c2b-baea-ffcdd4eeb046-kube-api-access-f42ts\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.747539 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.747576 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert podName:aaf10555-293e-4c2b-baea-ffcdd4eeb046 nodeName:}" failed. No retries permitted until 2025-10-03 07:13:05.24756151 +0000 UTC m=+1018.674812245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert") pod "openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" (UID: "aaf10555-293e-4c2b-baea-ffcdd4eeb046") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.752450 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mtktz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.752665 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.756094 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.787438 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.788917 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9cw6\" (UniqueName: \"kubernetes.io/projected/368ec6b5-ba12-467e-ab4c-d46a83c31483-kube-api-access-g9cw6\") pod \"telemetry-operator-controller-manager-5ffb97cddf-vf8h7\" (UID: \"368ec6b5-ba12-467e-ab4c-d46a83c31483\") " pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.789084 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42ts\" (UniqueName: \"kubernetes.io/projected/aaf10555-293e-4c2b-baea-ffcdd4eeb046-kube-api-access-f42ts\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.789583 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9nw6\" (UniqueName: \"kubernetes.io/projected/2514440b-b998-4078-834e-642c3bcae80f-kube-api-access-r9nw6\") pod \"ovn-operator-controller-manager-855d7949fc-gnvp7\" (UID: \"2514440b-b998-4078-834e-642c3bcae80f\") " pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.796727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qqccz\" (UniqueName: \"kubernetes.io/projected/b842fd5e-6125-40cf-b729-947e54309c87-kube-api-access-qqccz\") pod \"placement-operator-controller-manager-ccbfcb8c-tnrxp\" (UID: \"b842fd5e-6125-40cf-b729-947e54309c87\") " pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.802405 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.810875 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.844669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.850674 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4nwr\" (UniqueName: \"kubernetes.io/projected/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-kube-api-access-l4nwr\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.850854 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.850931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.850962 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgbx\" (UniqueName: \"kubernetes.io/projected/9ab8a530-fefb-477b-85f7-6f716f684292-kube-api-access-gwgbx\") pod \"watcher-operator-controller-manager-5595cf6c95-4hcbj\" (UID: \"9ab8a530-fefb-477b-85f7-6f716f684292\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.851016 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brbwh\" (UniqueName: \"kubernetes.io/projected/1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63-kube-api-access-brbwh\") pod \"swift-operator-controller-manager-76d5577b-7dv7x\" (UID: \"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.851254 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n4js\" (UniqueName: \"kubernetes.io/projected/d6c2b423-4398-416f-bb17-70a8eb814964-kube-api-access-6n4js\") pod \"test-operator-controller-manager-6bb6dcddc-hmtgk\" 
(UID: \"d6c2b423-4398-416f-bb17-70a8eb814964\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.851917 4810 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.851969 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert podName:42afdadf-08ba-4196-a815-a4fc0acf2181 nodeName:}" failed. No retries permitted until 2025-10-03 07:13:05.85194696 +0000 UTC m=+1019.279197695 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert") pod "infra-operator-controller-manager-7c9978f67-wqn2b" (UID: "42afdadf-08ba-4196-a815-a4fc0acf2181") : secret "infra-operator-webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.871883 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.886489 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgbx\" (UniqueName: \"kubernetes.io/projected/9ab8a530-fefb-477b-85f7-6f716f684292-kube-api-access-gwgbx\") pod \"watcher-operator-controller-manager-5595cf6c95-4hcbj\" (UID: \"9ab8a530-fefb-477b-85f7-6f716f684292\") " pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.896192 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n4js\" (UniqueName: \"kubernetes.io/projected/d6c2b423-4398-416f-bb17-70a8eb814964-kube-api-access-6n4js\") pod \"test-operator-controller-manager-6bb6dcddc-hmtgk\" (UID: \"d6c2b423-4398-416f-bb17-70a8eb814964\") " pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.897267 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brbwh\" (UniqueName: \"kubernetes.io/projected/1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63-kube-api-access-brbwh\") pod \"swift-operator-controller-manager-76d5577b-7dv7x\" (UID: \"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63\") " pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.898411 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.916352 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.921353 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.924560 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-lj65m" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.930534 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr"] Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.958370 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4nwr\" (UniqueName: \"kubernetes.io/projected/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-kube-api-access-l4nwr\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.958431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcvsf\" (UniqueName: \"kubernetes.io/projected/52bc186b-754c-4fc1-982d-435c345718a7-kube-api-access-mcvsf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bcntr\" (UID: \"52bc186b-754c-4fc1-982d-435c345718a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.960936 4810 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.960948 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:04 crc kubenswrapper[4810]: E1003 07:13:04.960997 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert podName:9ac05823-a077-4e2a-8bf8-b5e9a454bf03 nodeName:}" failed. No retries permitted until 2025-10-03 07:13:05.460976885 +0000 UTC m=+1018.888227620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert") pod "openstack-operator-controller-manager-54df7874c5-l442f" (UID: "9ac05823-a077-4e2a-8bf8-b5e9a454bf03") : secret "webhook-server-cert" not found Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.988144 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:04 crc kubenswrapper[4810]: I1003 07:13:04.998266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4nwr\" (UniqueName: \"kubernetes.io/projected/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-kube-api-access-l4nwr\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.011980 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.036466 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.074715 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcvsf\" (UniqueName: \"kubernetes.io/projected/52bc186b-754c-4fc1-982d-435c345718a7-kube-api-access-mcvsf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bcntr\" (UID: \"52bc186b-754c-4fc1-982d-435c345718a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.126121 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcvsf\" (UniqueName: \"kubernetes.io/projected/52bc186b-754c-4fc1-982d-435c345718a7-kube-api-access-mcvsf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bcntr\" (UID: \"52bc186b-754c-4fc1-982d-435c345718a7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.237169 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.270341 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.282993 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:05 crc kubenswrapper[4810]: E1003 07:13:05.283192 4810 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 07:13:05 crc kubenswrapper[4810]: E1003 07:13:05.283295 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert podName:aaf10555-293e-4c2b-baea-ffcdd4eeb046 nodeName:}" failed. No retries permitted until 2025-10-03 07:13:06.283273613 +0000 UTC m=+1019.710524348 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert") pod "openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" (UID: "aaf10555-293e-4c2b-baea-ffcdd4eeb046") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 07:13:05 crc kubenswrapper[4810]: W1003 07:13:05.368029 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701c6350_6581_453d_abc4_728bff24f1a5.slice/crio-81186841ef883ce5b8cc3e103b90e13989756362e9de86ee5cc8a443ec2eedd3 WatchSource:0}: Error finding container 81186841ef883ce5b8cc3e103b90e13989756362e9de86ee5cc8a443ec2eedd3: Status 404 returned error can't find the container with id 81186841ef883ce5b8cc3e103b90e13989756362e9de86ee5cc8a443ec2eedd3 Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.409346 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.470401 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.486538 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.493537 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ac05823-a077-4e2a-8bf8-b5e9a454bf03-cert\") pod \"openstack-operator-controller-manager-54df7874c5-l442f\" (UID: \"9ac05823-a077-4e2a-8bf8-b5e9a454bf03\") " pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.532615 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.608970 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.615647 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p"] Oct 03 07:13:05 crc kubenswrapper[4810]: W1003 07:13:05.616343 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode910dcfe_cedc_4a55_ad55_39adb2422c48.slice/crio-8de6372e3bb3703098e3bf697164c4c1eba54a22eac526c03544b5771ce2cbc1 WatchSource:0}: Error finding container 8de6372e3bb3703098e3bf697164c4c1eba54a22eac526c03544b5771ce2cbc1: Status 404 returned error can't find the container with id 8de6372e3bb3703098e3bf697164c4c1eba54a22eac526c03544b5771ce2cbc1 Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.669488 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.846914 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.874523 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.889151 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c"] Oct 03 07:13:05 crc kubenswrapper[4810]: W1003 07:13:05.892447 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ef1249c_27f8_4cc0_8134_690b6b8773d1.slice/crio-e1b8cbcbc8e421d6e03e482270a7f4547b0d982fbe297fc495d0cafee7ec3e40 WatchSource:0}: Error finding container e1b8cbcbc8e421d6e03e482270a7f4547b0d982fbe297fc495d0cafee7ec3e40: Status 404 returned error can't find the container with id e1b8cbcbc8e421d6e03e482270a7f4547b0d982fbe297fc495d0cafee7ec3e40 Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.895438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.907772 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42afdadf-08ba-4196-a815-a4fc0acf2181-cert\") pod \"infra-operator-controller-manager-7c9978f67-wqn2b\" (UID: \"42afdadf-08ba-4196-a815-a4fc0acf2181\") " pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.908505 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2"] Oct 03 07:13:05 crc kubenswrapper[4810]: W1003 07:13:05.921450 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21a68e29_471e_428d_9491_ae4b33a01e8a.slice/crio-5d231705a1b79f8e4f4972c638408f714cbd075efbec7a7323c39db66efd8ad0 WatchSource:0}: Error finding container 5d231705a1b79f8e4f4972c638408f714cbd075efbec7a7323c39db66efd8ad0: Status 404 returned error can't find the container with id 5d231705a1b79f8e4f4972c638408f714cbd075efbec7a7323c39db66efd8ad0 Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.928767 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.937084 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws"] Oct 03 07:13:05 crc kubenswrapper[4810]: I1003 07:13:05.997838 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx"] Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.008774 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x"] Oct 03 07:13:06 crc kubenswrapper[4810]: W1003 07:13:06.016360 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb842fd5e_6125_40cf_b729_947e54309c87.slice/crio-11615bde12927170f16e4daf49acc57a61af9fb646405f5c188d4627b186525f WatchSource:0}: Error finding container 11615bde12927170f16e4daf49acc57a61af9fb646405f5c188d4627b186525f: Status 404 returned error can't find the container with id 11615bde12927170f16e4daf49acc57a61af9fb646405f5c188d4627b186525f Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.016466 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7"] Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.018117 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-brbwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-76d5577b-7dv7x_openstack-operators(1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.018868 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qqccz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-ccbfcb8c-tnrxp_openstack-operators(b842fd5e-6125-40cf-b729-947e54309c87): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.018989 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9cw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5ffb97cddf-vf8h7_openstack-operators(368ec6b5-ba12-467e-ab4c-d46a83c31483): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.021994 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp"] Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.090720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" event={"ID":"4686e0fc-c208-4a38-a7ae-33adc6123d0d","Type":"ContainerStarted","Data":"3d93a4a05f368a5fe5444775a6401da49eaac79a1182c4f2b7c44fc8839d3506"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.093231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" event={"ID":"7ed3bd35-9f00-40d3-9ed0-a111c4131b11","Type":"ContainerStarted","Data":"fa9bfffc11ea05e1f09778f4359c1f757b7ab0914ba335b6f2208762e1192c1f"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.093989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" event={"ID":"84c853db-41b6-4013-b6a6-b9f6c3fa74e3","Type":"ContainerStarted","Data":"e71e18fbc96b264aa4dd35425251eadc0a52090be9c739149c7e4e7f126cedab"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.094749 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" event={"ID":"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63","Type":"ContainerStarted","Data":"00e8789f6504dc6ccd1e93d7a8c5b12816206bb2502f90efa692a0948b51bef2"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.101425 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" 
event={"ID":"6ef1249c-27f8-4cc0-8134-690b6b8773d1","Type":"ContainerStarted","Data":"e1b8cbcbc8e421d6e03e482270a7f4547b0d982fbe297fc495d0cafee7ec3e40"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.103700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" event={"ID":"5fa60d72-b0e4-4a27-b2af-d228cd9db5da","Type":"ContainerStarted","Data":"7710db045ad22cfb1f01fcafed842e4cadf7920be6444265620686ecf0a19118"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.105457 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" event={"ID":"e910dcfe-cedc-4a55-ad55-39adb2422c48","Type":"ContainerStarted","Data":"8de6372e3bb3703098e3bf697164c4c1eba54a22eac526c03544b5771ce2cbc1"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.112980 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" event={"ID":"368ec6b5-ba12-467e-ab4c-d46a83c31483","Type":"ContainerStarted","Data":"cfb0c9aaa0a7bcb0d66276c01b782dadc3abacb9eff1e53383f218359491c5e1"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.119738 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" event={"ID":"b1578049-8763-4c26-b149-8497d94da92e","Type":"ContainerStarted","Data":"659909cd44874945a338ed01e7bc5c84f843985f0ac7292b391779b3621854bc"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.123274 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" event={"ID":"9aeddbe0-0fe7-451c-afaf-bd0074505142","Type":"ContainerStarted","Data":"a6ec33aae6291cbba58ec4bb130b6be4c7aecca16072a63cdda67b0d38148955"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.123549 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr"] Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.127286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" event={"ID":"d812ca0a-067a-4c91-a460-d340ef72d051","Type":"ContainerStarted","Data":"74a89c6e1fe37393ca3b2568c6ced02c1fb735c79ae31d98f1276b5c3d9e77ee"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.147839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" event={"ID":"701c6350-6581-453d-abc4-728bff24f1a5","Type":"ContainerStarted","Data":"81186841ef883ce5b8cc3e103b90e13989756362e9de86ee5cc8a443ec2eedd3"} Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.149121 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mcvsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-bcntr_openstack-operators(52bc186b-754c-4fc1-982d-435c345718a7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.149476 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj"] Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.150203 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" podUID="52bc186b-754c-4fc1-982d-435c345718a7" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.151089 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" event={"ID":"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d","Type":"ContainerStarted","Data":"b3b697af2040413b109a7a47e836fb976a84b66d991fa80c1d1a98b7c20b80a9"} Oct 03 07:13:06 crc kubenswrapper[4810]: W1003 07:13:06.157173 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab8a530_fefb_477b_85f7_6f716f684292.slice/crio-a0912c9411ec18b385c85cc522cc19ae634064916f9ce2797b1f6de0ac933155 WatchSource:0}: Error finding container a0912c9411ec18b385c85cc522cc19ae634064916f9ce2797b1f6de0ac933155: Status 404 returned error can't find the container with id a0912c9411ec18b385c85cc522cc19ae634064916f9ce2797b1f6de0ac933155 Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.158022 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" event={"ID":"21a68e29-471e-428d-9491-ae4b33a01e8a","Type":"ContainerStarted","Data":"5d231705a1b79f8e4f4972c638408f714cbd075efbec7a7323c39db66efd8ad0"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.158800 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" event={"ID":"b842fd5e-6125-40cf-b729-947e54309c87","Type":"ContainerStarted","Data":"11615bde12927170f16e4daf49acc57a61af9fb646405f5c188d4627b186525f"} Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.159534 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gwgbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5595cf6c95-4hcbj_openstack-operators(9ab8a530-fefb-477b-85f7-6f716f684292): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.166424 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.166992 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" event={"ID":"6c644f74-4957-40ad-8286-b172b802a323","Type":"ContainerStarted","Data":"e813201c6611344f982905be41be6498ef7429d8e718c76e1c7224d9ca119b12"} Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.170981 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk"] Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.174850 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7"] Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.192880 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f"] Oct 03 07:13:06 crc kubenswrapper[4810]: W1003 07:13:06.202480 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2514440b_b998_4078_834e_642c3bcae80f.slice/crio-1f8d2c3b49b6f44e6ba8fc5359bf00b71167d1912b1b0524aa832fce8e59bd28 WatchSource:0}: Error finding container 1f8d2c3b49b6f44e6ba8fc5359bf00b71167d1912b1b0524aa832fce8e59bd28: Status 404 returned error can't find the container with id 1f8d2c3b49b6f44e6ba8fc5359bf00b71167d1912b1b0524aa832fce8e59bd28 Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.217338 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9nw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-855d7949fc-gnvp7_openstack-operators(2514440b-b998-4078-834e-642c3bcae80f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 07:13:06 crc kubenswrapper[4810]: W1003 07:13:06.226680 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac05823_a077_4e2a_8bf8_b5e9a454bf03.slice/crio-6f817f140eee101d748e834bf69914386f79a397300ae5a953a06ea51de29ee7 WatchSource:0}: Error finding container 6f817f140eee101d748e834bf69914386f79a397300ae5a953a06ea51de29ee7: Status 404 returned error can't find the container with id 6f817f140eee101d748e834bf69914386f79a397300ae5a953a06ea51de29ee7 Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.303051 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.308234 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf10555-293e-4c2b-baea-ffcdd4eeb046-cert\") pod \"openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48\" (UID: \"aaf10555-293e-4c2b-baea-ffcdd4eeb046\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.588802 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.664622 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b"] Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.669622 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" podUID="b842fd5e-6125-40cf-b729-947e54309c87" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.669717 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" podUID="1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.744558 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" podUID="368ec6b5-ba12-467e-ab4c-d46a83c31483" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.751773 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" podUID="9ab8a530-fefb-477b-85f7-6f716f684292" Oct 03 07:13:06 crc kubenswrapper[4810]: E1003 07:13:06.863838 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" podUID="2514440b-b998-4078-834e-642c3bcae80f" Oct 03 07:13:06 crc kubenswrapper[4810]: I1003 07:13:06.888677 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48"] Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.175220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" event={"ID":"52bc186b-754c-4fc1-982d-435c345718a7","Type":"ContainerStarted","Data":"ac27daabf513f2309f4151da0c2b3884a5a83c733316d282136b0f576e4e10de"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.177004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" event={"ID":"42afdadf-08ba-4196-a815-a4fc0acf2181","Type":"ContainerStarted","Data":"76789eeda13cc6ebf664dc06e3ef84e2fa6508122bf698002b95e778d16c47e5"} Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.178915 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" podUID="52bc186b-754c-4fc1-982d-435c345718a7" Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.182106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" event={"ID":"9ab8a530-fefb-477b-85f7-6f716f684292","Type":"ContainerStarted","Data":"13282c4e33b3a2764cfca7099879eccd4971c2cfeb8c55ded1521e589820490d"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.182143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" event={"ID":"9ab8a530-fefb-477b-85f7-6f716f684292","Type":"ContainerStarted","Data":"a0912c9411ec18b385c85cc522cc19ae634064916f9ce2797b1f6de0ac933155"} Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.183125 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" podUID="9ab8a530-fefb-477b-85f7-6f716f684292" Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.183924 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" event={"ID":"368ec6b5-ba12-467e-ab4c-d46a83c31483","Type":"ContainerStarted","Data":"bffe11888c186d0d380a522d009511a1a5d071d6dfef9c48bab346913a16dde9"} Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.188837 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" podUID="368ec6b5-ba12-467e-ab4c-d46a83c31483" Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.201518 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" event={"ID":"9ac05823-a077-4e2a-8bf8-b5e9a454bf03","Type":"ContainerStarted","Data":"2c476da5e325977ec9a66d5a02534364ad61ddd819b334add6a5f166f13ce5ad"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.205408 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" event={"ID":"9ac05823-a077-4e2a-8bf8-b5e9a454bf03","Type":"ContainerStarted","Data":"6f817f140eee101d748e834bf69914386f79a397300ae5a953a06ea51de29ee7"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.210623 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" event={"ID":"d6c2b423-4398-416f-bb17-70a8eb814964","Type":"ContainerStarted","Data":"6b921ce0e0cfb14e569bd17920cee2c90b9d1172abc6dad5f2e4515544befe65"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.213272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" event={"ID":"b842fd5e-6125-40cf-b729-947e54309c87","Type":"ContainerStarted","Data":"6ac37704a8ed6d2ba72eb850dbbf1094b4752efc9b546dbf47a4afea82e44b2f"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.215400 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" 
event={"ID":"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63","Type":"ContainerStarted","Data":"506aa87f5b8cbaa4a0296b93b5fc9cec4a61e80ac2287a8dd62eca229d10555b"} Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.217422 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" podUID="1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63" Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.218748 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" podUID="b842fd5e-6125-40cf-b729-947e54309c87" Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.219861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" event={"ID":"aaf10555-293e-4c2b-baea-ffcdd4eeb046","Type":"ContainerStarted","Data":"65fabdb6a5e435ea0e57792b3290334f286b64ea352c2639bff1ccbb57939c93"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.221629 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" event={"ID":"2514440b-b998-4078-834e-642c3bcae80f","Type":"ContainerStarted","Data":"64fb4173d712273112a49271cc39bbdf2cde02030405bcf33ae1082be5384c8c"} Oct 03 07:13:07 crc kubenswrapper[4810]: I1003 07:13:07.221678 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" event={"ID":"2514440b-b998-4078-834e-642c3bcae80f","Type":"ContainerStarted","Data":"1f8d2c3b49b6f44e6ba8fc5359bf00b71167d1912b1b0524aa832fce8e59bd28"} Oct 03 07:13:07 crc kubenswrapper[4810]: E1003 07:13:07.230732 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" podUID="2514440b-b998-4078-834e-642c3bcae80f" Oct 03 07:13:08 crc kubenswrapper[4810]: I1003 07:13:08.235377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" event={"ID":"9ac05823-a077-4e2a-8bf8-b5e9a454bf03","Type":"ContainerStarted","Data":"c773209bbb1cfc2cb9046f41fdb2832c2dd1a79322e6471dc25e52074ab71044"} Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.238524 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:018151bd5ff830ec03c6b8e3d53cfb9456ca6e1e34793bdd4f7edd39a0146fa6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" podUID="9ab8a530-fefb-477b-85f7-6f716f684292" Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.238743 4810 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8f5eee2eb7b77432ef1a88ed693ff981514359dfc808581f393bcef252de5cfa\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" podUID="368ec6b5-ba12-467e-ab4c-d46a83c31483" Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.238787 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" podUID="2514440b-b998-4078-834e-642c3bcae80f" Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.240312 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" podUID="b842fd5e-6125-40cf-b729-947e54309c87" Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.240333 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" podUID="52bc186b-754c-4fc1-982d-435c345718a7" Oct 03 07:13:08 crc kubenswrapper[4810]: E1003 07:13:08.240334 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" podUID="1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63" Oct 03 07:13:08 crc kubenswrapper[4810]: I1003 07:13:08.372255 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" podStartSLOduration=4.372240818 podStartE2EDuration="4.372240818s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:13:08.354539645 +0000 UTC m=+1021.781790410" watchObservedRunningTime="2025-10-03 07:13:08.372240818 +0000 UTC m=+1021.799491553" Oct 03 07:13:09 crc kubenswrapper[4810]: I1003 07:13:09.243807 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:15 crc kubenswrapper[4810]: I1003 07:13:15.678779 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-54df7874c5-l442f" Oct 03 07:13:30 crc kubenswrapper[4810]: E1003 07:13:30.584138 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee" Oct 03 07:13:30 crc kubenswrapper[4810]: E1003 07:13:30.587044 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:d1fad97d2cd602a4f7b6fd6c202464ac117b20e6608c17aa04cadbceb78a498d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:1c99923410d4cd0a721d2cc8a51d91d3ac800d5fda508c972ebe1e85ed2ca4d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:af4e2467469edf3b1fa739ef819ead98dfa934542ae40ec3266d58f66ba44f99,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:99f246f3b9bad7c46b671da12cd166614f0573b3dbf0aa04f4b32d4a9f5a81c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:d617f09ab1f6ef522c6f70db597cf20ab79ccebf25e225653cbf2e999354a5c0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1c73b7b1034524ecfb36ce1eaa37ecbbcd5cb3f7fee0149b3bce0b0170bae8ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:9e14abeaab473b6731830d9c5bf383bb52111c919c787aee06b833f8cd3f83b1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:0838a5c5edf54c1c8af59c93955f26e4eda6645297058780e0f61c77b65683d9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:c50baa554100db160210b65733f71d6d128e38f96fa0552819854c62ede75953,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:e43273f867316a0e03469d82dc37487d3cdd2b08b4a153ba270c7cae1749bf92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:de50c7dd282aa3898f1d0a31ecb2a300688f1f234662e6bbe12f35f88b484083,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:31c0d98fec7ff16416903874af0addeff03a7e72ede256990f2a71589e8be5ce,ValueFrom:nil,},EnvVar{Name
:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:ac586b71d28a6240b29f4b464b19fea812ffc81e1182d172570b4be5ac58ea70,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:a5df039c808a65a273073128a627d6700897d6ebf81a9c62412c7d06be3b9a6e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:8f09cdc578caa07e0b5a9ec4e96a251a6d7dd43b2ef1edacb56543c997c259e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:e870d0a1b0c758601a067bfccc539ca04222e0c867872f679cea5833e0fcbf94,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:8f112731484f983f272f4c95558ffa098e96e610ddc5130ee0f2b2a239e9058a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:277ac4620d95ce3fe2f552f59b82b70962ba024d498710adc45b863bcc7244ff,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:09eebb1f87217fbb0249f4ebc19192cd282833aac27103081160b8949dd4361c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:e17eb4221e8981df97744e5168a8c759abcd925c2a483d04e3fdecd78128dae4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:02f99d84c8cc2c59ac4b8d98f219a1138b0aed8e50f91f9326ef55db5c187cd8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:7a636c7f518127d4292aa5417113fd611b85ad49ddbc8273455aa2fe5066a533,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:61f617fd809b55b2eceeec84b3283757af80d1001659e80877ac69e9643ba89f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:0b083fceb6e323a30f4c7308a275ea88243420ef38df77ac322af302c4c4dd2d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:9e173574f9216e5c42498c3794075ead54b6850c66094c4be628b52063f5814c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},E
nvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:980d0d43a83e61b74634b46864c2070fcb26348f8bc5a3375f161703e4041d3d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:d561737cf54869c67a819635c4a10ca4a9ed21cc6046ffd4f17301670d9a25fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:941076bbb1577abd91f42e0f19b0a191f7e393135d823ed203b122875033888b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:2133db6669a24570a266e7c053fc71bbfadd16cd9cd0bc8b87633e73c03c4719,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:55682010f0f5aea02f59df1e0a827cc6915048b7545c25432fb0cb8501898d0b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:814536e8e4848f6612cd4ada641d46ae7d766878b89918fc5df11f3930747d3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:a4f12a27e60f17034ba47f57dba0c5ae3f9e3c6c681f2e417bb87cb132f502e7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:17b8c6c9fbcc7092cba64a264adb9af6decd7db24ee2c60607a9045d55031b51,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:f0864392605772b30f07dcb67ec8bb75d5b779756c537983377044d899c1b099,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:d9a44db937205e4c4f2cd2d247d230de2eb9207089f35a7ae7cfb11301406fac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:1ab1deb86e7e5ba67b4cd9f5974de6707e5a5948e8f01fc1156dbf5e452340a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:a895c2b3a12aa21f9541a76213b6058ce3252aca002d66025d5935f4ea5873c7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:e7c778fd348f881fea490bb9ddf465347068a60fcd65f9cbfedb615815bba2a2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:a21c91d6927d863be8aef3023a527bc3466a0ddffc018df0c970ce14396ceee0,ValueFrom
:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:7053c79b8354195fd09a5ea1347ad49a35443923d4e4578f80615c63d83313d3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:4626ebaa9dbe27fc95b31a48e69397fadef7c9779670c01555f872873c393f74,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:c840d7e9775d7f7ed1c6700d973bef79318fe92ac6fc8ed0616dcec13ef95c92,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:fcb50aade382ff516554b84b45c742a5adafb460fd67bd0fa2fc7cbb30adf5c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:54373b05fcd33538d153507943da0c118e303a5c61a19c6bbe79a0786fe8ce1d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:8c9be58185245280d7282e8973cc6e23e6b08520ce126aeb91cfbcef0c144690,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:676ba6130835d00defc3214769d5fe1827ee41420a05f8556f361aac502a7efc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:3dbd2ac58b5f64ab3cf3eef3c44a52f0ccd363568c0739a5d18d6b9c9edddf5d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:fded6f454a54e601894e06989243e8896f43940c77cd8f4c904fe43c120b1595,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:5d10c016b13499110b5f9ca2bccfaf6d2fd4298c9f02580d7208fe91850da0a6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:43f2c4ec2e38934288015cb5d5ae92941e8b3fa9a613539175641e2c16cfc0cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:c506a314e354f1ab274c46f9969b254f820e7515bbd9a24c9877dfbb10ece37e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:96d4b699758dd3d408b4c672dbe4392fd09783b4dc60783389905d7220b6524c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:7a0f3de7dda85fba7ad2929c7b01a2d42c11df9fe83f47a8e499a9da51e7f48c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:49b5ae7f895266b90cf3c02503fb7146726e59ad782fdf88112ad6954112d7e4,ValueFrom:nil,},EnvVar{Name
:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:19b3d48b3c29eaa3a6d76fc145e212389f245c077bbf24eb5c1de0c96f3f7190,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:227891a9f4821a92c49ddc27301303287d5632b6ac199e9fe402581f1831ec01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:26e3ada4b9fee357ef8bbb1c342b38c49c096ede8a498116e3753ad45354fb47,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:2789b45ae2a5a9a80e4864e691f9e32fb9c9e1938cf92bda7c07defdbc78cdc2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:8df8259e737625667b13897dc0094bf3d7ced54f414dda93293ad4cb68af1d43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:2fe4f8c71e11a926450d6553e5cb5c7b2db5d0de8426aa969f30d3d566114ff8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:ab5265aef98352336f23b18080f3ba110250859dc0edc20819348311a4a53044,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:a3c1b94a285064d150145340c06ad5b0afc4aa20caa74523f3972c19b1d1ea61,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:129e24971fee94cc60b5f440605f1512fb932a884e38e64122f38f11f942e3b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:b96baffbb926f93936bd52f2a1ef4fe1d31bb469d6489e9fb67bf00b99156551,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:d659d1ffbbaff7c76fc96e6600dc9b03c53af2c9d63cfb4626dfb5831b7b1ad7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:e3fcd72e1a2790ca7db5d5c40c1ae597de4b020dd51debcab063352e6e5f7d79,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:2504c0db038b850cdd6057fc50e109715a4453c386e4f4d4f901a20dc7b2036a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:35c124624fd84930496975032e22d57e517c5958e71ba63124a306a5949c71d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e3accbf4293c544194bd2151d4d0bd8b26828ddacda96
8bad5d5a6f05c2406db,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:75ce8c4f9c68aaba6cab59749e726b2f94d29ba7b7897b18112fe1bd350efd8f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:6390af78808d7cd69a4f5c7cb88f47690e54c9b8838b9461f4b21c4127ce770c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:14489a8a681c482a643cb47fa90d0a3596b4570e13cfc760541ac80d37cd31b3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:87367a67c7cb73476fb8d08ba108da843ac61170381458608e778a33c024c0c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:d6123a9349d422888df97ee72d32643dd534f81c521f6f313c5d5e64e2db60c1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:b273fd1e1da4190dc4cc67469d180b66b5a22eb6ec9afc76ef36dd6ea2beaea5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:9561306ec9455914cd05a0a0b3e56d72c7164aa41d0f0ef9b03ac7d7343538b8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:1115e5a2dce397b4a34a082cba1937903818ab5928048fcf775c4a4e6dda2d07,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f42ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48_openstack-operators(aaf10555-293e-4c2b-baea-ffcdd4eeb046): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:13:32 crc kubenswrapper[4810]: E1003 07:13:32.059174 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7" Oct 03 07:13:32 crc kubenswrapper[4810]: E1003 07:13:32.059556 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkhdw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-6c9969c6c6-zvpws_openstack-operators(21a68e29-471e-428d-9491-ae4b33a01e8a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:13:32 crc kubenswrapper[4810]: I1003 07:13:32.088249 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:13:32 crc kubenswrapper[4810]: I1003 07:13:32.088301 4810 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:13:36 crc kubenswrapper[4810]: E1003 07:13:36.080666 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" podUID="aaf10555-293e-4c2b-baea-ffcdd4eeb046" Oct 03 07:13:36 crc kubenswrapper[4810]: E1003 07:13:36.267443 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" podUID="21a68e29-471e-428d-9491-ae4b33a01e8a" Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.466823 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" event={"ID":"d812ca0a-067a-4c91-a460-d340ef72d051","Type":"ContainerStarted","Data":"8f857ed1e9227be61301db3559454dc3f2eb5f73b3e5cf583beb7057b64233ef"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.468942 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" event={"ID":"7ed3bd35-9f00-40d3-9ed0-a111c4131b11","Type":"ContainerStarted","Data":"18bfcf2c60443db2649d38972493468f7f1e001a75aa8f2636400a4463edecd0"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.488439 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" event={"ID":"d6c2b423-4398-416f-bb17-70a8eb814964","Type":"ContainerStarted","Data":"53668c90968112f0b2ee732fd83415149fe4ba8bb415f5c442daffaf03ba555e"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.498776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" event={"ID":"21a68e29-471e-428d-9491-ae4b33a01e8a","Type":"ContainerStarted","Data":"48fbec6bb3eceac2d9d66a688ff7dbdacee6092759a8ccc928d3a5e52ab3028d"} Oct 03 07:13:36 crc kubenswrapper[4810]: E1003 07:13:36.501634 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" podUID="21a68e29-471e-428d-9491-ae4b33a01e8a" Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.504261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" event={"ID":"6ef1249c-27f8-4cc0-8134-690b6b8773d1","Type":"ContainerStarted","Data":"8bb095218306d2f95fb6181c0ecf8e354b80792114b6aac0990452dc4d6ba812"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.512453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" 
event={"ID":"42afdadf-08ba-4196-a815-a4fc0acf2181","Type":"ContainerStarted","Data":"3efcb55a509ac417ec8c6fada6a07436ea5da64e1332f3c9b9c01b0287535dc6"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.515838 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" event={"ID":"368ec6b5-ba12-467e-ab4c-d46a83c31483","Type":"ContainerStarted","Data":"6daad7a4925a17b85f45e6448589fa6a640f48349b5ecbede95de400bfc14440"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.516503 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.552156 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" event={"ID":"84c853db-41b6-4013-b6a6-b9f6c3fa74e3","Type":"ContainerStarted","Data":"04b49ed15cceb5d4bd66f48d3cf11265a8b40d43bca8eaf01ec50cf9d8e11a54"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.554883 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" podStartSLOduration=2.727220176 podStartE2EDuration="32.55486962s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.018931433 +0000 UTC m=+1019.446182168" lastFinishedPulling="2025-10-03 07:13:35.846580877 +0000 UTC m=+1049.273831612" observedRunningTime="2025-10-03 07:13:36.552615849 +0000 UTC m=+1049.979866584" watchObservedRunningTime="2025-10-03 07:13:36.55486962 +0000 UTC m=+1049.982120355" Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.573848 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" event={"ID":"701c6350-6581-453d-abc4-728bff24f1a5","Type":"ContainerStarted","Data":"3bdd9adffe201533f85dcb76185a9db06aac79664ea8bc64ef7576b7aa38cb56"} Oct 03 07:13:36 crc kubenswrapper[4810]: I1003 07:13:36.579377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" event={"ID":"aaf10555-293e-4c2b-baea-ffcdd4eeb046","Type":"ContainerStarted","Data":"5011e80d4c66d6c347d1e6c7b9e5cb552f10f58e08a485bd93bc8a4603370091"} Oct 03 07:13:36 crc kubenswrapper[4810]: E1003 07:13:36.584089 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" podUID="aaf10555-293e-4c2b-baea-ffcdd4eeb046" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.591492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" event={"ID":"6ef1249c-27f8-4cc0-8134-690b6b8773d1","Type":"ContainerStarted","Data":"7d29cb89fef39203f3be21d499b31bda5d39afcc2a1faac3e571798c5b296740"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.591693 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.595830 
4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" event={"ID":"b842fd5e-6125-40cf-b729-947e54309c87","Type":"ContainerStarted","Data":"5d1f03db10fb5ecdae545d2a8e8dcf3a105b033cc5d3c27e2605226a6cd97ebe"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.596265 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.597839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" event={"ID":"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d","Type":"ContainerStarted","Data":"72c8d6f575b6e3ec279ab536ba270af237f04f64d08647200bf13c22a0fc147d"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.599401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" event={"ID":"b1578049-8763-4c26-b149-8497d94da92e","Type":"ContainerStarted","Data":"dd7e01494b63a49be075273cfbb57e40f41979037d03615dbb394870ae5a8e86"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.600799 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" event={"ID":"52bc186b-754c-4fc1-982d-435c345718a7","Type":"ContainerStarted","Data":"fc0668b1a2f61c50e11aab7f321d85e4a9157f89a7247aad244c57c647ef526d"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.602492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" event={"ID":"5fa60d72-b0e4-4a27-b2af-d228cd9db5da","Type":"ContainerStarted","Data":"ff42589d41455b1f1b86f157e377a883f75ab80aece2b392d8deed458fa22e34"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.603872 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" event={"ID":"84c853db-41b6-4013-b6a6-b9f6c3fa74e3","Type":"ContainerStarted","Data":"d8291b2c62e36b2366652d87b49c99ecb25bcbeb58df7b8da0c1cc1596fbd939"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.604402 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.605412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" event={"ID":"4686e0fc-c208-4a38-a7ae-33adc6123d0d","Type":"ContainerStarted","Data":"3eecaf348dd682eb2501f2dbaf49ee77553bcba8399aa6074aa8e7e529249067"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.607336 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" event={"ID":"7ed3bd35-9f00-40d3-9ed0-a111c4131b11","Type":"ContainerStarted","Data":"1a07ec8a18e8878e641f413459b52286b63e46037f560ab9fef8f43f2a57bc4d"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.607725 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.609710 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" 
event={"ID":"6c644f74-4957-40ad-8286-b172b802a323","Type":"ContainerStarted","Data":"e4b3233e7cfd5ace801f9a818d6b2241823119c76f124ac0e3b46ffba6a67387"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.610856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" event={"ID":"9ab8a530-fefb-477b-85f7-6f716f684292","Type":"ContainerStarted","Data":"9c2755e06c490413c54cbe393d235b8b65a078507ec706627c762975aac20a4a"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.611247 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.612815 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" event={"ID":"701c6350-6581-453d-abc4-728bff24f1a5","Type":"ContainerStarted","Data":"e569c7ec9baef5f47c9c62bccddc74fa7ef0cb03ddafc563607359a203481fe8"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.613269 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.616790 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" event={"ID":"d6c2b423-4398-416f-bb17-70a8eb814964","Type":"ContainerStarted","Data":"5d885d2b3aafd75766e564536a142493039d291045a3e2153001b32e8b40f200"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.617085 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.618615 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" event={"ID":"2514440b-b998-4078-834e-642c3bcae80f","Type":"ContainerStarted","Data":"826bbc726955158f51ef24bfbe18790e0162951d6b56d23b4f0454bb61bb30fa"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.619006 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.621080 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" podStartSLOduration=5.321953842 podStartE2EDuration="33.621062523s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.89644353 +0000 UTC m=+1019.323694275" lastFinishedPulling="2025-10-03 07:13:34.195552211 +0000 UTC m=+1047.622802956" observedRunningTime="2025-10-03 07:13:37.618713788 +0000 UTC m=+1051.045964563" watchObservedRunningTime="2025-10-03 07:13:37.621062523 +0000 UTC m=+1051.048313258" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.629079 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" event={"ID":"e910dcfe-cedc-4a55-ad55-39adb2422c48","Type":"ContainerStarted","Data":"4e80edf61f94186eb1e5b33db01876e49cc7822899d372fa23029d57ae099b20"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.630231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" event={"ID":"9aeddbe0-0fe7-451c-afaf-bd0074505142","Type":"ContainerStarted","Data":"d90a59d6065d78809e0e2091bfe39562c87c05ee937de4e0a4141ac0d6223b5f"} Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.631453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" event={"ID":"1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63","Type":"ContainerStarted","Data":"08d0853075ca3019271df143af6aeee57c2b0d9edab417777317e80c3acb6ff5"} Oct 03 07:13:37 crc kubenswrapper[4810]: E1003 07:13:37.632908 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:f50229c8a33fd581bccbe5f34bbaf3936c1b454802e755c9b48b40b76a8239ee\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" podUID="aaf10555-293e-4c2b-baea-ffcdd4eeb046" Oct 03 07:13:37 crc kubenswrapper[4810]: E1003 07:13:37.633452 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:516f76ed86dd34225e6d0309451c7886bb81ff69032ba28125ae4d0cec54bce7\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" podUID="21a68e29-471e-428d-9491-ae4b33a01e8a" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.638222 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" podStartSLOduration=3.96568653 podStartE2EDuration="33.63820388s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.159396407 +0000 UTC m=+1019.586647142" lastFinishedPulling="2025-10-03 07:13:35.831913757 +0000 UTC m=+1049.259164492" observedRunningTime="2025-10-03 07:13:37.63635378 +0000 UTC m=+1051.063604515" watchObservedRunningTime="2025-10-03 07:13:37.63820388 +0000 UTC m=+1051.065454615" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.664578 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" podStartSLOduration=20.424983549 podStartE2EDuration="33.66455951s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.191815763 +0000 UTC m=+1019.619066498" lastFinishedPulling="2025-10-03 07:13:19.431391724 +0000 UTC m=+1032.858642459" observedRunningTime="2025-10-03 07:13:37.660868909 +0000 UTC m=+1051.088119644" watchObservedRunningTime="2025-10-03 07:13:37.66455951 +0000 UTC m=+1051.091810245" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.683830 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bcntr" podStartSLOduration=4.00154649 podStartE2EDuration="33.683808355s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.149014644 +0000 UTC m=+1019.576265379" lastFinishedPulling="2025-10-03 07:13:35.831276489 +0000 UTC m=+1049.258527244" observedRunningTime="2025-10-03 07:13:37.679374024 +0000 UTC m=+1051.106624759" watchObservedRunningTime="2025-10-03 07:13:37.683808355 +0000 UTC m=+1051.111059090" Oct 03 07:13:37 crc 
kubenswrapper[4810]: I1003 07:13:37.695802 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" podStartSLOduration=8.087478372 podStartE2EDuration="34.695788502s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.372177429 +0000 UTC m=+1018.799428164" lastFinishedPulling="2025-10-03 07:13:31.980487559 +0000 UTC m=+1045.407738294" observedRunningTime="2025-10-03 07:13:37.692909194 +0000 UTC m=+1051.120159929" watchObservedRunningTime="2025-10-03 07:13:37.695788502 +0000 UTC m=+1051.123039227" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.720366 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" podStartSLOduration=22.983700064 podStartE2EDuration="34.720349393s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.548112602 +0000 UTC m=+1018.975363337" lastFinishedPulling="2025-10-03 07:13:17.284761931 +0000 UTC m=+1030.712012666" observedRunningTime="2025-10-03 07:13:37.709476916 +0000 UTC m=+1051.136727651" watchObservedRunningTime="2025-10-03 07:13:37.720349393 +0000 UTC m=+1051.147600118" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.733382 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" podStartSLOduration=4.080339031 podStartE2EDuration="33.733362268s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.018784539 +0000 UTC m=+1019.446035274" lastFinishedPulling="2025-10-03 07:13:35.671807776 +0000 UTC m=+1049.099058511" observedRunningTime="2025-10-03 07:13:37.732327649 +0000 UTC m=+1051.159578394" watchObservedRunningTime="2025-10-03 07:13:37.733362268 +0000 UTC m=+1051.160613003" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.758578 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" podStartSLOduration=6.484522805 podStartE2EDuration="33.758556926s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.893992783 +0000 UTC m=+1019.321243518" lastFinishedPulling="2025-10-03 07:13:33.168026904 +0000 UTC m=+1046.595277639" observedRunningTime="2025-10-03 07:13:37.754843334 +0000 UTC m=+1051.182094079" watchObservedRunningTime="2025-10-03 07:13:37.758556926 +0000 UTC m=+1051.185807661" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.783279 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" podStartSLOduration=4.169164886 podStartE2EDuration="33.78326192s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.217157694 +0000 UTC m=+1019.644408429" lastFinishedPulling="2025-10-03 07:13:35.831254728 +0000 UTC m=+1049.258505463" observedRunningTime="2025-10-03 07:13:37.778577052 +0000 UTC m=+1051.205827777" watchObservedRunningTime="2025-10-03 07:13:37.78326192 +0000 UTC m=+1051.210512655" Oct 03 07:13:37 crc kubenswrapper[4810]: I1003 07:13:37.816795 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" podStartSLOduration=4.00452607 
podStartE2EDuration="33.816773854s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.017959586 +0000 UTC m=+1019.445210321" lastFinishedPulling="2025-10-03 07:13:35.83020736 +0000 UTC m=+1049.257458105" observedRunningTime="2025-10-03 07:13:37.811036868 +0000 UTC m=+1051.238287613" watchObservedRunningTime="2025-10-03 07:13:37.816773854 +0000 UTC m=+1051.244024589" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.640450 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" event={"ID":"b1578049-8763-4c26-b149-8497d94da92e","Type":"ContainerStarted","Data":"5b3745ba6465713bb298f33b25e725d53c45e2bda03939230f864cef6b945217"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.640818 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.642855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" event={"ID":"d812ca0a-067a-4c91-a460-d340ef72d051","Type":"ContainerStarted","Data":"aa2319d2c14e46dd39ede9e08dad96bb4ec7d30acbe9eb054ee2700125d56fc4"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.643092 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.645469 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" event={"ID":"e910dcfe-cedc-4a55-ad55-39adb2422c48","Type":"ContainerStarted","Data":"97f6f433254cbd8b17c6a7d32c7bab1df852c9744c6a9fa27e897b085438350f"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.645584 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.647205 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" event={"ID":"6c644f74-4957-40ad-8286-b172b802a323","Type":"ContainerStarted","Data":"daff92d4642b3deee4a4940d041f66cc1a5cc8b81d380069a97132f37f8a3175"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.647322 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.648844 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" event={"ID":"4686e0fc-c208-4a38-a7ae-33adc6123d0d","Type":"ContainerStarted","Data":"e6f9a820b5fb49387fd3d2ce1fcfcb406e7412b82024f277e75d4a71d5514565"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.648929 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.650544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" event={"ID":"9aeddbe0-0fe7-451c-afaf-bd0074505142","Type":"ContainerStarted","Data":"8a57d89cd16753d00a9ad4599dee793fc8982636937d6560cb34d36d282c1871"} Oct 
03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.650703 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.652293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" event={"ID":"5fa60d72-b0e4-4a27-b2af-d228cd9db5da","Type":"ContainerStarted","Data":"eaaf2c37a52645903e5243a07522d80da57692ca002cfc54e26f7a6f175dbff6"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.652412 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.654143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" event={"ID":"911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d","Type":"ContainerStarted","Data":"5e9044dfec0868db729fc1951674f4ab7d4402bd3b2ecec2605d42a0bb60340e"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.654284 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.656351 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" event={"ID":"42afdadf-08ba-4196-a815-a4fc0acf2181","Type":"ContainerStarted","Data":"5d9d22e44a4c391c9cd74ba0aa5f30672b77822953523df58c1de933b6ca0215"} Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.664741 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" podStartSLOduration=7.502929863 podStartE2EDuration="34.66471923s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.007219914 +0000 UTC m=+1019.434470649" lastFinishedPulling="2025-10-03 07:13:33.169009281 +0000 UTC m=+1046.596260016" observedRunningTime="2025-10-03 07:13:38.658592943 +0000 UTC m=+1052.085843688" watchObservedRunningTime="2025-10-03 07:13:38.66471923 +0000 UTC m=+1052.091969965" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.683438 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" podStartSLOduration=8.134282839 podStartE2EDuration="35.68341871s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.619252464 +0000 UTC m=+1019.046503199" lastFinishedPulling="2025-10-03 07:13:33.168388335 +0000 UTC m=+1046.595639070" observedRunningTime="2025-10-03 07:13:38.681709263 +0000 UTC m=+1052.108959998" watchObservedRunningTime="2025-10-03 07:13:38.68341871 +0000 UTC m=+1052.110669455" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.708841 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" podStartSLOduration=9.226744609 podStartE2EDuration="35.708818194s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.686395352 +0000 UTC m=+1020.113646087" lastFinishedPulling="2025-10-03 07:13:33.168468897 +0000 UTC m=+1046.595719672" observedRunningTime="2025-10-03 07:13:38.70136832 +0000 UTC 
m=+1052.128619055" watchObservedRunningTime="2025-10-03 07:13:38.708818194 +0000 UTC m=+1052.136068929" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.726700 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" podStartSLOduration=9.024465908 podStartE2EDuration="35.726679041s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.361077116 +0000 UTC m=+1018.788327851" lastFinishedPulling="2025-10-03 07:13:32.063290249 +0000 UTC m=+1045.490540984" observedRunningTime="2025-10-03 07:13:38.722079865 +0000 UTC m=+1052.149330620" watchObservedRunningTime="2025-10-03 07:13:38.726679041 +0000 UTC m=+1052.153929796" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.742400 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" podStartSLOduration=7.864489436 podStartE2EDuration="35.74237976s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.856317605 +0000 UTC m=+1019.283568340" lastFinishedPulling="2025-10-03 07:13:33.734207919 +0000 UTC m=+1047.161458664" observedRunningTime="2025-10-03 07:13:38.738423452 +0000 UTC m=+1052.165674187" watchObservedRunningTime="2025-10-03 07:13:38.74237976 +0000 UTC m=+1052.169630515" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.763929 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" podStartSLOduration=6.480820563 podStartE2EDuration="34.763909997s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.912368134 +0000 UTC m=+1019.339618869" lastFinishedPulling="2025-10-03 07:13:34.195457568 +0000 UTC m=+1047.622708303" observedRunningTime="2025-10-03 07:13:38.758758256 +0000 UTC m=+1052.186008991" watchObservedRunningTime="2025-10-03 07:13:38.763909997 +0000 UTC m=+1052.191160732" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.793749 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" podStartSLOduration=6.980675656 podStartE2EDuration="34.793723551s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.92101454 +0000 UTC m=+1019.348265275" lastFinishedPulling="2025-10-03 07:13:33.734062435 +0000 UTC m=+1047.161313170" observedRunningTime="2025-10-03 07:13:38.781276061 +0000 UTC m=+1052.208526796" watchObservedRunningTime="2025-10-03 07:13:38.793723551 +0000 UTC m=+1052.220974296" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.803431 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" podStartSLOduration=7.578453168 podStartE2EDuration="35.803407805s" podCreationTimestamp="2025-10-03 07:13:03 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.510020222 +0000 UTC m=+1018.937270957" lastFinishedPulling="2025-10-03 07:13:33.734974859 +0000 UTC m=+1047.162225594" observedRunningTime="2025-10-03 07:13:38.799454657 +0000 UTC m=+1052.226705402" watchObservedRunningTime="2025-10-03 07:13:38.803407805 +0000 UTC m=+1052.230658540" Oct 03 07:13:38 crc kubenswrapper[4810]: I1003 07:13:38.815099 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" podStartSLOduration=6.698043362 podStartE2EDuration="34.815078844s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.617025783 +0000 UTC m=+1019.044276518" lastFinishedPulling="2025-10-03 07:13:33.734061265 +0000 UTC m=+1047.161312000" observedRunningTime="2025-10-03 07:13:38.813269524 +0000 UTC m=+1052.240520259" watchObservedRunningTime="2025-10-03 07:13:38.815078844 +0000 UTC m=+1052.242329579" Oct 03 07:13:39 crc kubenswrapper[4810]: I1003 07:13:39.666216 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.216170 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8686fd99f7-jlqcq" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.219800 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6d6d64fdcf-5d7tm" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.230789 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-58d86cd59d-nthwx" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.339154 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-586b66cf4f-qxv5t" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.575026 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-d785ddfd5-kln57" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.593615 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5ffbdb7ddf-nf2kl" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.632836 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-66fdd975d9-47p2c" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.701688 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-696ff4bcdd-msm2p" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.726491 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-59b5fc9845-hv9vj" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.759477 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-549fb68678-6qbx2" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.814086 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-855d7949fc-gnvp7" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.815068 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5b45478b88-cj2zz" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.847819 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-b4444585c-9qkqx" Oct 03 07:13:44 crc 
kubenswrapper[4810]: I1003 07:13:44.877577 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-ccbfcb8c-tnrxp" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.901536 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5ffb97cddf-vf8h7" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.989630 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:44 crc kubenswrapper[4810]: I1003 07:13:44.991405 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-76d5577b-7dv7x" Oct 03 07:13:45 crc kubenswrapper[4810]: I1003 07:13:45.018544 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6bb6dcddc-hmtgk" Oct 03 07:13:45 crc kubenswrapper[4810]: I1003 07:13:45.039997 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5595cf6c95-4hcbj" Oct 03 07:13:46 crc kubenswrapper[4810]: I1003 07:13:46.172381 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7c9978f67-wqn2b" Oct 03 07:13:49 crc kubenswrapper[4810]: I1003 07:13:49.306143 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:13:50 crc kubenswrapper[4810]: I1003 07:13:50.772234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" event={"ID":"aaf10555-293e-4c2b-baea-ffcdd4eeb046","Type":"ContainerStarted","Data":"218edc81822485123214796da8dcd3d0e53770cc343833080779a3eb3edbb8cb"} Oct 03 07:13:50 crc kubenswrapper[4810]: I1003 07:13:50.772877 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:50 crc kubenswrapper[4810]: I1003 07:13:50.803602 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" podStartSLOduration=3.132769467 podStartE2EDuration="46.803580758s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:06.900699522 +0000 UTC m=+1020.327950257" lastFinishedPulling="2025-10-03 07:13:50.571510813 +0000 UTC m=+1063.998761548" observedRunningTime="2025-10-03 07:13:50.79671398 +0000 UTC m=+1064.223964715" watchObservedRunningTime="2025-10-03 07:13:50.803580758 +0000 UTC m=+1064.230831493" Oct 03 07:13:56 crc kubenswrapper[4810]: I1003 07:13:56.596936 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48" Oct 03 07:13:56 crc kubenswrapper[4810]: I1003 07:13:56.830207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" event={"ID":"21a68e29-471e-428d-9491-ae4b33a01e8a","Type":"ContainerStarted","Data":"a66aa4bf47787c389f1dbef338dbabaa03a848917e202d5fca4d6291465b6472"} Oct 03 07:13:56 crc kubenswrapper[4810]: I1003 07:13:56.831511 4810 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:14:02 crc kubenswrapper[4810]: I1003 07:14:02.088585 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:14:02 crc kubenswrapper[4810]: I1003 07:14:02.090212 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:14:04 crc kubenswrapper[4810]: I1003 07:14:04.444011 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" Oct 03 07:14:04 crc kubenswrapper[4810]: I1003 07:14:04.473221 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c9969c6c6-zvpws" podStartSLOduration=10.972495036 podStartE2EDuration="1m0.473200738s" podCreationTimestamp="2025-10-03 07:13:04 +0000 UTC" firstStartedPulling="2025-10-03 07:13:05.926086459 +0000 UTC m=+1019.353337194" lastFinishedPulling="2025-10-03 07:13:55.426792151 +0000 UTC m=+1068.854042896" observedRunningTime="2025-10-03 07:13:56.851720535 +0000 UTC m=+1070.278971290" watchObservedRunningTime="2025-10-03 07:14:04.473200738 +0000 UTC m=+1077.900451483" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.652564 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.656467 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.658588 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.658967 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-94gt2" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.659562 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.659619 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.663317 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.696339 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.697635 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.701286 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.719362 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.787572 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.787631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.787814 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.787931 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696fq\" (UniqueName: \"kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.788007 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5glj6\" (UniqueName: \"kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.889373 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.889434 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696fq\" (UniqueName: \"kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.889473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5glj6\" (UniqueName: \"kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " 
pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.889506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.889534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.890394 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.890431 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.890459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.919170 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696fq\" (UniqueName: \"kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq\") pod \"dnsmasq-dns-6d84845cb9-njnwn\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.919268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5glj6\" (UniqueName: \"kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6\") pod \"dnsmasq-dns-8687b65d7f-pkmvh\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:20 crc kubenswrapper[4810]: I1003 07:14:20.972459 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:21 crc kubenswrapper[4810]: I1003 07:14:21.047077 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:21 crc kubenswrapper[4810]: I1003 07:14:21.462993 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:21 crc kubenswrapper[4810]: I1003 07:14:21.542097 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:21 crc kubenswrapper[4810]: W1003 07:14:21.543763 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod669c7958_a5ed_4f87_bf16_10fb3b8c5c18.slice/crio-78e391fc78f21e7bff472937e0035aeaeddc57f8e6fe6371389d8c79710a5158 WatchSource:0}: Error finding container 78e391fc78f21e7bff472937e0035aeaeddc57f8e6fe6371389d8c79710a5158: Status 404 returned error can't find the container with id 78e391fc78f21e7bff472937e0035aeaeddc57f8e6fe6371389d8c79710a5158 Oct 03 07:14:22 crc kubenswrapper[4810]: I1003 07:14:22.034386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" event={"ID":"a931becd-77ee-4198-9318-f5388578db65","Type":"ContainerStarted","Data":"f2f98670b52254f7740084624308af5a3a13f1ea758c37b89fdaae5d20b8af76"} Oct 03 07:14:22 crc kubenswrapper[4810]: I1003 07:14:22.035601 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" event={"ID":"669c7958-a5ed-4f87-bf16-10fb3b8c5c18","Type":"ContainerStarted","Data":"78e391fc78f21e7bff472937e0035aeaeddc57f8e6fe6371389d8c79710a5158"} Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.452491 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.478624 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.480829 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.501629 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.632026 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.632074 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt247\" (UniqueName: \"kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.632100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.733257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.733317 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt247\" (UniqueName: \"kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.733344 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.734296 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.734423 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.745972 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.773656 4810 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.774877 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.776086 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt247\" (UniqueName: \"kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247\") pod \"dnsmasq-dns-b599c6fc9-ggjqr\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.780120 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.813391 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.937045 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.937120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:23 crc kubenswrapper[4810]: I1003 07:14:23.937408 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmb7t\" (UniqueName: \"kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.038719 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmb7t\" (UniqueName: \"kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.039756 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.039910 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.041651 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.042044 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.080314 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmb7t\" (UniqueName: \"kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t\") pod \"dnsmasq-dns-5cb7995759-df88j\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.105076 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.128293 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.627223 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.628362 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638585 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638611 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638653 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638725 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638725 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638786 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.638844 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.639005 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6kffh" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.672772 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:14:24 crc kubenswrapper[4810]: W1003 07:14:24.680621 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod153dac9a_2e18_4482_b260_c5e5e90a578f.slice/crio-83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e WatchSource:0}: Error finding container 
83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e: Status 404 returned error can't find the container with id 83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768181 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768223 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768247 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768264 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768312 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768348 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768380 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768411 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768447 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.768462 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hj27\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869457 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869538 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869578 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869597 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869632 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869663 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869695 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.869712 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj27\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.870049 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.870471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.871230 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.871409 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.871457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.872437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.876819 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.878607 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.878634 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.882442 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.885481 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hj27\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.893283 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.894524 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.897001 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.897174 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-x8mlj" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.897256 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.897410 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.898614 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.899667 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.901357 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.914870 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.930388 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.963092 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971507 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971560 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971598 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971614 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971635 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971710 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971744 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 
crc kubenswrapper[4810]: I1003 07:14:24.971766 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw5cw\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:24 crc kubenswrapper[4810]: I1003 07:14:24.971801 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.072996 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073045 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073062 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073096 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073139 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073178 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073206 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw5cw\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073261 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.073473 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.076484 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.076562 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.077225 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.077463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.077956 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.080443 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.082122 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.082183 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.083298 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.093270 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw5cw\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.093750 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.119345 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" event={"ID":"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4","Type":"ContainerStarted","Data":"54a27a69c56933940b4016d153278e764cc4df51b4da5779709573dab11d1211"} Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.122697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-df88j" event={"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerStarted","Data":"83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e"} Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.276619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 07:14:25 crc kubenswrapper[4810]: I1003 07:14:25.483221 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.512982 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.524698 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.531776 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.532767 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.533040 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.534115 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.539256 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.539435 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-pwzlz" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.543950 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.545285 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.550212 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.554233 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-k9kv8" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.554727 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.555226 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.561227 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.597821 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624757 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624773 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624817 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624838 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5wd\" (UniqueName: \"kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624858 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624873 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.624923 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725689 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725731 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5wd\" (UniqueName: \"kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725750 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725768 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725796 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725819 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7lr\" (UniqueName: \"kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725845 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725968 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.725989 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726043 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726063 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726116 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726134 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726150 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.726296 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.727300 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.727573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.727822 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.728150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.732061 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.753448 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.753711 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.767959 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5wd\" (UniqueName: \"kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.784199 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827440 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7lr\" (UniqueName: \"kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827488 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827537 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827552 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827570 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.827585 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.830659 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.833782 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.834066 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.834814 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.836238 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.836918 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.838558 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.851117 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.853792 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7lr\" (UniqueName: \"kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.870254 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.887715 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.968685 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.969967 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.973657 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.973962 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-hpvj6" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.974091 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 07:14:27 crc kubenswrapper[4810]: I1003 07:14:27.980192 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.133321 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.133373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.133401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.133727 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvp6v\" (UniqueName: \"kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.133800 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.190281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.235576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.235630 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.235660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.235711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvp6v\" (UniqueName: \"kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.235734 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.236912 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.237620 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.239920 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.241326 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.258216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvp6v\" (UniqueName: \"kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v\") pod \"memcached-0\" (UID: 
\"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: I1003 07:14:28.285325 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 07:14:28 crc kubenswrapper[4810]: W1003 07:14:28.869062 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37cd32da_b730_4e57_a2e0_41bf95ff8ca1.slice/crio-8d08b5b89b59bdbf02c3c7b5d19cb457dcc42de2ffcd1dbfaddd56151c5a6d76 WatchSource:0}: Error finding container 8d08b5b89b59bdbf02c3c7b5d19cb457dcc42de2ffcd1dbfaddd56151c5a6d76: Status 404 returned error can't find the container with id 8d08b5b89b59bdbf02c3c7b5d19cb457dcc42de2ffcd1dbfaddd56151c5a6d76 Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.166560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerStarted","Data":"8d08b5b89b59bdbf02c3c7b5d19cb457dcc42de2ffcd1dbfaddd56151c5a6d76"} Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.627769 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.628796 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.631681 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8vnfl" Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.635603 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.771146 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxj8r\" (UniqueName: \"kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r\") pod \"kube-state-metrics-0\" (UID: \"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c\") " pod="openstack/kube-state-metrics-0" Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.873126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxj8r\" (UniqueName: \"kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r\") pod \"kube-state-metrics-0\" (UID: \"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c\") " pod="openstack/kube-state-metrics-0" Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.896724 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxj8r\" (UniqueName: \"kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r\") pod \"kube-state-metrics-0\" (UID: \"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c\") " pod="openstack/kube-state-metrics-0" Oct 03 07:14:29 crc kubenswrapper[4810]: I1003 07:14:29.944306 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:14:32 crc kubenswrapper[4810]: I1003 07:14:32.089130 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:14:32 crc kubenswrapper[4810]: I1003 07:14:32.089485 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:14:32 crc kubenswrapper[4810]: I1003 07:14:32.089530 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:14:32 crc kubenswrapper[4810]: I1003 07:14:32.090162 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:14:32 crc kubenswrapper[4810]: I1003 07:14:32.090239 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a" gracePeriod=600 Oct 03 07:14:33 crc kubenswrapper[4810]: I1003 07:14:33.197861 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a" exitCode=0 Oct 03 07:14:33 crc kubenswrapper[4810]: I1003 07:14:33.197924 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a"} Oct 03 07:14:33 crc kubenswrapper[4810]: I1003 07:14:33.198210 4810 scope.go:117] "RemoveContainer" containerID="7da1c65a94c15a6a11c717509f6f10fc3e7bd7d70577dc44b1f1dd0e974561a5" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.372577 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.373723 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.376370 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.376593 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-22hxc" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.376633 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.396825 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.398354 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.419236 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.455945 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544653 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544739 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544766 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544788 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544820 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544836 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 
07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544876 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz9gv\" (UniqueName: \"kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544925 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwp92\" (UniqueName: \"kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.544991 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.545068 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.545086 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.545110 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.648966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 
07:14:34.649591 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649882 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649979 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.649996 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650010 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650075 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650163 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz9gv\" (UniqueName: \"kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650225 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwp92\" (UniqueName: \"kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92\") pod \"ovn-controller-7rltw\" (UID: 
\"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650261 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650323 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650349 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650392 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650455 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.650838 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.651467 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.652209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.652461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.653450 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.658141 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.666125 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.669331 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwp92\" (UniqueName: \"kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92\") pod \"ovn-controller-7rltw\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.672488 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz9gv\" (UniqueName: \"kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv\") pod \"ovn-controller-ovs-s8f2s\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.693269 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7rltw" Oct 03 07:14:34 crc kubenswrapper[4810]: I1003 07:14:34.718184 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.048714 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.050799 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.054316 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.054525 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.054673 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q7xz6" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.054911 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.057268 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.061199 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175133 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175201 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175252 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175348 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175433 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175468 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcnzn\" 
(UniqueName: \"kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.175491 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276618 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276658 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276740 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276758 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcnzn\" (UniqueName: \"kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276774 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.276806 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 
07:14:36.285170 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.287854 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.288453 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.288838 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.289553 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.289685 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.301321 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.309227 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcnzn\" (UniqueName: \"kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.317313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:36 crc kubenswrapper[4810]: I1003 07:14:36.371626 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.335340 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.336840 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.345504 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.345779 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.345985 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qpjtg" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.347694 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.351359 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.496229 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.496754 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d6g2\" (UniqueName: \"kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.496829 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.496880 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.496972 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.497030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc 
kubenswrapper[4810]: I1003 07:14:37.497106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.497145 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599048 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599173 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599212 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d6g2\" (UniqueName: \"kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599279 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599346 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599374 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.599457 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.600355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.601510 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.602915 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.603351 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.606734 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.608114 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.613268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.619573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d6g2\" (UniqueName: \"kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.634269 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:37 crc kubenswrapper[4810]: I1003 07:14:37.662413 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 07:14:40 crc kubenswrapper[4810]: E1003 07:14:40.854003 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 07:14:40 crc kubenswrapper[4810]: E1003 07:14:40.854390 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-696fq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d84845cb9-njnwn_openstack(a931becd-77ee-4198-9318-f5388578db65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:14:40 crc kubenswrapper[4810]: E1003 07:14:40.855552 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" podUID="a931becd-77ee-4198-9318-f5388578db65" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.290993 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" event={"ID":"a931becd-77ee-4198-9318-f5388578db65","Type":"ContainerDied","Data":"f2f98670b52254f7740084624308af5a3a13f1ea758c37b89fdaae5d20b8af76"} Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.291697 4810 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f2f98670b52254f7740084624308af5a3a13f1ea758c37b89fdaae5d20b8af76" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.395022 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.408837 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:42 crc kubenswrapper[4810]: W1003 07:14:42.450620 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod064ce5a9_4b27_4851_93ea_aa8c3f038eb8.slice/crio-a94510aec3efc19ce3f067e90ef6e3002b25e5aa00e04d35cecbb4cc79fb849c WatchSource:0}: Error finding container a94510aec3efc19ce3f067e90ef6e3002b25e5aa00e04d35cecbb4cc79fb849c: Status 404 returned error can't find the container with id a94510aec3efc19ce3f067e90ef6e3002b25e5aa00e04d35cecbb4cc79fb849c Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.598980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config\") pod \"a931becd-77ee-4198-9318-f5388578db65\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.599379 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696fq\" (UniqueName: \"kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq\") pod \"a931becd-77ee-4198-9318-f5388578db65\" (UID: \"a931becd-77ee-4198-9318-f5388578db65\") " Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.599821 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config" (OuterVolumeSpecName: "config") pod "a931becd-77ee-4198-9318-f5388578db65" (UID: "a931becd-77ee-4198-9318-f5388578db65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.600042 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a931becd-77ee-4198-9318-f5388578db65-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.605266 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq" (OuterVolumeSpecName: "kube-api-access-696fq") pod "a931becd-77ee-4198-9318-f5388578db65" (UID: "a931becd-77ee-4198-9318-f5388578db65"). InnerVolumeSpecName "kube-api-access-696fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.689473 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.701491 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696fq\" (UniqueName: \"kubernetes.io/projected/a931becd-77ee-4198-9318-f5388578db65-kube-api-access-696fq\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.859697 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:14:42 crc kubenswrapper[4810]: W1003 07:14:42.870559 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d666cf5_d522_45c6_8fbf_13da1289394a.slice/crio-f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1 WatchSource:0}: Error finding container f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1: Status 404 returned error can't find the container with id f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1 Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.964440 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:14:42 crc kubenswrapper[4810]: I1003 07:14:42.968245 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 07:14:42 crc kubenswrapper[4810]: W1003 07:14:42.970557 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fce1d94_bf74_40fe_a1bf_9e03b2ad92e1.slice/crio-a068a4fc698e7be87ad1d45230c27fbc1f04c95a5ddf4a3a59fa3f2e51df7381 WatchSource:0}: Error finding container a068a4fc698e7be87ad1d45230c27fbc1f04c95a5ddf4a3a59fa3f2e51df7381: Status 404 returned error can't find the container with id a068a4fc698e7be87ad1d45230c27fbc1f04c95a5ddf4a3a59fa3f2e51df7381 Oct 03 07:14:42 crc kubenswrapper[4810]: W1003 07:14:42.972111 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99d4cb8b_1033_4fc9_b5dd_ad30b89a754c.slice/crio-3ad66bca48961feb88f68c26578a8f2c2fc44ddc1d6cedf80a471be1d61d8e54 WatchSource:0}: Error finding container 3ad66bca48961feb88f68c26578a8f2c2fc44ddc1d6cedf80a471be1d61d8e54: Status 404 returned error can't find the container with id 3ad66bca48961feb88f68c26578a8f2c2fc44ddc1d6cedf80a471be1d61d8e54 Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.106991 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:14:43 crc kubenswrapper[4810]: W1003 07:14:43.116282 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2722ac3d_f149_4926_9b5e_cb43d477c15a.slice/crio-995fd8cea6e7fd1d4a637b0f857271a5cdacc40d36a7b64e6d825b969aa66fd8 WatchSource:0}: Error finding container 995fd8cea6e7fd1d4a637b0f857271a5cdacc40d36a7b64e6d825b969aa66fd8: Status 404 returned error can't find the container with id 995fd8cea6e7fd1d4a637b0f857271a5cdacc40d36a7b64e6d825b969aa66fd8 Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.128762 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:14:43 crc kubenswrapper[4810]: W1003 07:14:43.146934 4810 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde2b9513_b087_4498_aeef_d912e22091fb.slice/crio-8ba11fa7b0b6e74c92907035953e3b3b230b40f08f4fbeaa771ee5694cb8423f WatchSource:0}: Error finding container 8ba11fa7b0b6e74c92907035953e3b3b230b40f08f4fbeaa771ee5694cb8423f: Status 404 returned error can't find the container with id 8ba11fa7b0b6e74c92907035953e3b3b230b40f08f4fbeaa771ee5694cb8423f Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.242595 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:14:43 crc kubenswrapper[4810]: W1003 07:14:43.249210 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16b4d1d_9bc5_4634_a03e_fd823db95e0d.slice/crio-c8d7d5c5deeb27f1069124a7085ee4ba7de677aa6ec0b1d49baaac5a3d796c56 WatchSource:0}: Error finding container c8d7d5c5deeb27f1069124a7085ee4ba7de677aa6ec0b1d49baaac5a3d796c56: Status 404 returned error can't find the container with id c8d7d5c5deeb27f1069124a7085ee4ba7de677aa6ec0b1d49baaac5a3d796c56 Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.343605 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.343645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerStarted","Data":"f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.346035 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw" event={"ID":"de2b9513-b087-4498-aeef-d912e22091fb","Type":"ContainerStarted","Data":"8ba11fa7b0b6e74c92907035953e3b3b230b40f08f4fbeaa771ee5694cb8423f"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.346077 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.348021 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c","Type":"ContainerStarted","Data":"3ad66bca48961feb88f68c26578a8f2c2fc44ddc1d6cedf80a471be1d61d8e54"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.350706 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerStarted","Data":"a94510aec3efc19ce3f067e90ef6e3002b25e5aa00e04d35cecbb4cc79fb849c"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.352861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerStarted","Data":"995fd8cea6e7fd1d4a637b0f857271a5cdacc40d36a7b64e6d825b969aa66fd8"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.354011 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerStarted","Data":"c8d7d5c5deeb27f1069124a7085ee4ba7de677aa6ec0b1d49baaac5a3d796c56"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.355261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerStarted","Data":"d341764c4d2c9dfccb600b236d913b4e6d6213ab3a29426f4ad741f70beb4a4c"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.356331 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1","Type":"ContainerStarted","Data":"a068a4fc698e7be87ad1d45230c27fbc1f04c95a5ddf4a3a59fa3f2e51df7381"} Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.356371 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d84845cb9-njnwn" Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.404093 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:43 crc kubenswrapper[4810]: I1003 07:14:43.415272 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d84845cb9-njnwn"] Oct 03 07:14:44 crc kubenswrapper[4810]: I1003 07:14:44.370311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerStarted","Data":"c91af75b681d9da00496316aa26097786cb96c87179bd1b1811efde6271d4a3f"} Oct 03 07:14:45 crc kubenswrapper[4810]: I1003 07:14:45.314161 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a931becd-77ee-4198-9318-f5388578db65" path="/var/lib/kubelet/pods/a931becd-77ee-4198-9318-f5388578db65/volumes" Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.029353 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.030220 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pt247,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-b599c6fc9-ggjqr_openstack(110ce6e0-3865-43b7-bf9e-2ad228e6dfc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.031802 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.388155 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5" Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.388599 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:deed73df7ec3db8068a2ded61c540a3fa530863d2c77498014508b022c542db5,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5glj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8687b65d7f-pkmvh_openstack(669c7958-a5ed-4f87-bf16-10fb3b8c5c18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:14:46 crc kubenswrapper[4810]: I1003 07:14:46.389544 4810 generic.go:334] "Generic (PLEG): container finished" podID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerID="3ff73edca1345a254a3086f098fe5ad66a80530ab9ba5dd7d33d3e1dfe7eacf1" exitCode=0 Oct 03 07:14:46 crc kubenswrapper[4810]: E1003 07:14:46.390277 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" podUID="669c7958-a5ed-4f87-bf16-10fb3b8c5c18" Oct 03 07:14:46 crc kubenswrapper[4810]: I1003 07:14:46.390707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-df88j" event={"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerDied","Data":"3ff73edca1345a254a3086f098fe5ad66a80530ab9ba5dd7d33d3e1dfe7eacf1"} Oct 03 07:14:47 crc kubenswrapper[4810]: I1003 07:14:47.421644 4810 generic.go:334] "Generic (PLEG): container finished" podID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerID="4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c" exitCode=0 Oct 03 07:14:47 crc kubenswrapper[4810]: I1003 07:14:47.421715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" event={"ID":"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4","Type":"ContainerDied","Data":"4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c"} Oct 03 07:14:47 crc kubenswrapper[4810]: I1003 07:14:47.427278 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-df88j" 
event={"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerStarted","Data":"c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9"} Oct 03 07:14:47 crc kubenswrapper[4810]: I1003 07:14:47.427414 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:47 crc kubenswrapper[4810]: I1003 07:14:47.591406 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cb7995759-df88j" podStartSLOduration=3.231826592 podStartE2EDuration="24.591377823s" podCreationTimestamp="2025-10-03 07:14:23 +0000 UTC" firstStartedPulling="2025-10-03 07:14:24.683772048 +0000 UTC m=+1098.111022783" lastFinishedPulling="2025-10-03 07:14:46.043323279 +0000 UTC m=+1119.470574014" observedRunningTime="2025-10-03 07:14:47.559609756 +0000 UTC m=+1120.986860491" watchObservedRunningTime="2025-10-03 07:14:47.591377823 +0000 UTC m=+1121.018628558" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.013159 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.113033 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config\") pod \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.113092 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc\") pod \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.113180 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5glj6\" (UniqueName: \"kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6\") pod \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\" (UID: \"669c7958-a5ed-4f87-bf16-10fb3b8c5c18\") " Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.113737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "669c7958-a5ed-4f87-bf16-10fb3b8c5c18" (UID: "669c7958-a5ed-4f87-bf16-10fb3b8c5c18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.113758 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config" (OuterVolumeSpecName: "config") pod "669c7958-a5ed-4f87-bf16-10fb3b8c5c18" (UID: "669c7958-a5ed-4f87-bf16-10fb3b8c5c18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.116854 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6" (OuterVolumeSpecName: "kube-api-access-5glj6") pod "669c7958-a5ed-4f87-bf16-10fb3b8c5c18" (UID: "669c7958-a5ed-4f87-bf16-10fb3b8c5c18"). InnerVolumeSpecName "kube-api-access-5glj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.214516 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.214588 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.214599 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5glj6\" (UniqueName: \"kubernetes.io/projected/669c7958-a5ed-4f87-bf16-10fb3b8c5c18-kube-api-access-5glj6\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.456462 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.456652 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8687b65d7f-pkmvh" event={"ID":"669c7958-a5ed-4f87-bf16-10fb3b8c5c18","Type":"ContainerDied","Data":"78e391fc78f21e7bff472937e0035aeaeddc57f8e6fe6371389d8c79710a5158"} Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.531312 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:48 crc kubenswrapper[4810]: I1003 07:14:48.569700 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8687b65d7f-pkmvh"] Oct 03 07:14:49 crc kubenswrapper[4810]: I1003 07:14:49.312239 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669c7958-a5ed-4f87-bf16-10fb3b8c5c18" path="/var/lib/kubelet/pods/669c7958-a5ed-4f87-bf16-10fb3b8c5c18/volumes" Oct 03 07:14:54 crc kubenswrapper[4810]: I1003 07:14:54.107572 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:14:54 crc kubenswrapper[4810]: I1003 07:14:54.180771 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.517560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerStarted","Data":"00634c712dc0748b4e08ff0a87638caa3ab48eec323f9b5a7efe244d9c29c7f0"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.519253 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerStarted","Data":"2b43b8e3fc4f0ec24d5e44665727bb914fd15db10c5890da4c565201c94e81b3"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.521772 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerStarted","Data":"6b1b4cc1d1b966fa22564b66bb53d584c8fb69081697fbded46dedc35eaf64df"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.523023 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1","Type":"ContainerStarted","Data":"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.523734 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.526246 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c","Type":"ContainerStarted","Data":"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.527013 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.529476 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" event={"ID":"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4","Type":"ContainerStarted","Data":"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.529638 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="dnsmasq-dns" containerID="cri-o://c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940" gracePeriod=10 Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.529960 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.532527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerStarted","Data":"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.534186 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw" event={"ID":"de2b9513-b087-4498-aeef-d912e22091fb","Type":"ContainerStarted","Data":"ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.534352 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7rltw" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.535868 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerStarted","Data":"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9"} Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.649159 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" podStartSLOduration=-9223372004.205662 podStartE2EDuration="32.649113081s" podCreationTimestamp="2025-10-03 07:14:23 +0000 UTC" firstStartedPulling="2025-10-03 07:14:24.179272998 +0000 UTC m=+1097.606523733" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:14:55.624087519 +0000 UTC m=+1129.051338264" watchObservedRunningTime="2025-10-03 07:14:55.649113081 +0000 UTC m=+1129.076363836" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.652190 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.887171376 podStartE2EDuration="28.65217374s" podCreationTimestamp="2025-10-03 07:14:27 +0000 UTC" firstStartedPulling="2025-10-03 07:14:42.974326748 +0000 UTC m=+1116.401577483" lastFinishedPulling="2025-10-03 07:14:54.739329112 +0000 UTC 
m=+1128.166579847" observedRunningTime="2025-10-03 07:14:55.59684376 +0000 UTC m=+1129.024094495" watchObservedRunningTime="2025-10-03 07:14:55.65217374 +0000 UTC m=+1129.079424485" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.684383 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7rltw" podStartSLOduration=10.022337606 podStartE2EDuration="21.684362919s" podCreationTimestamp="2025-10-03 07:14:34 +0000 UTC" firstStartedPulling="2025-10-03 07:14:43.160835226 +0000 UTC m=+1116.588085961" lastFinishedPulling="2025-10-03 07:14:54.822860539 +0000 UTC m=+1128.250111274" observedRunningTime="2025-10-03 07:14:55.677447819 +0000 UTC m=+1129.104698594" watchObservedRunningTime="2025-10-03 07:14:55.684362919 +0000 UTC m=+1129.111613664" Oct 03 07:14:55 crc kubenswrapper[4810]: I1003 07:14:55.697005 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.846767473 podStartE2EDuration="26.696985308s" podCreationTimestamp="2025-10-03 07:14:29 +0000 UTC" firstStartedPulling="2025-10-03 07:14:42.974956784 +0000 UTC m=+1116.402207519" lastFinishedPulling="2025-10-03 07:14:54.825174619 +0000 UTC m=+1128.252425354" observedRunningTime="2025-10-03 07:14:55.693174199 +0000 UTC m=+1129.120424974" watchObservedRunningTime="2025-10-03 07:14:55.696985308 +0000 UTC m=+1129.124236083" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.040994 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.160117 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt247\" (UniqueName: \"kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247\") pod \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.160203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc\") pod \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.160361 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config\") pod \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\" (UID: \"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4\") " Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.166463 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247" (OuterVolumeSpecName: "kube-api-access-pt247") pod "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" (UID: "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4"). InnerVolumeSpecName "kube-api-access-pt247". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.199537 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config" (OuterVolumeSpecName: "config") pod "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" (UID: "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.214326 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" (UID: "110ce6e0-3865-43b7-bf9e-2ad228e6dfc4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.261783 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.261820 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt247\" (UniqueName: \"kubernetes.io/projected/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-kube-api-access-pt247\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.261832 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.543563 4810 generic.go:334] "Generic (PLEG): container finished" podID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerID="c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940" exitCode=0 Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.543638 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.543640 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" event={"ID":"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4","Type":"ContainerDied","Data":"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940"} Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.543746 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b599c6fc9-ggjqr" event={"ID":"110ce6e0-3865-43b7-bf9e-2ad228e6dfc4","Type":"ContainerDied","Data":"54a27a69c56933940b4016d153278e764cc4df51b4da5779709573dab11d1211"} Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.543787 4810 scope.go:117] "RemoveContainer" containerID="c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.549234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerStarted","Data":"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3"} Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.570365 4810 scope.go:117] "RemoveContainer" containerID="4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.584704 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.592581 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b599c6fc9-ggjqr"] Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.600035 4810 scope.go:117] "RemoveContainer" containerID="c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940" Oct 03 07:14:56 crc kubenswrapper[4810]: E1003 
07:14:56.600552 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940\": container with ID starting with c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940 not found: ID does not exist" containerID="c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.600593 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940"} err="failed to get container status \"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940\": rpc error: code = NotFound desc = could not find container \"c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940\": container with ID starting with c910df66727195ce4739607b303ab0e701ae7a3e476e95525a846a3d8fe03940 not found: ID does not exist" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.600618 4810 scope.go:117] "RemoveContainer" containerID="4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c" Oct 03 07:14:56 crc kubenswrapper[4810]: E1003 07:14:56.601230 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c\": container with ID starting with 4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c not found: ID does not exist" containerID="4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c" Oct 03 07:14:56 crc kubenswrapper[4810]: I1003 07:14:56.601295 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c"} err="failed to get container status \"4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c\": rpc error: code = NotFound desc = could not find container \"4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c\": container with ID starting with 4e999db867de4073de5bb10b9b5ca5a36971951496a3a35f81bb7ed5f3a75a0c not found: ID does not exist" Oct 03 07:14:57 crc kubenswrapper[4810]: I1003 07:14:57.314444 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" path="/var/lib/kubelet/pods/110ce6e0-3865-43b7-bf9e-2ad228e6dfc4/volumes" Oct 03 07:14:57 crc kubenswrapper[4810]: I1003 07:14:57.558930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerStarted","Data":"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0"} Oct 03 07:14:57 crc kubenswrapper[4810]: I1003 07:14:57.562778 4810 generic.go:334] "Generic (PLEG): container finished" podID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerID="192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9" exitCode=0 Oct 03 07:14:57 crc kubenswrapper[4810]: I1003 07:14:57.562851 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerDied","Data":"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9"} Oct 03 07:14:58 crc kubenswrapper[4810]: I1003 07:14:58.577063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" 
event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerStarted","Data":"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0"} Oct 03 07:14:59 crc kubenswrapper[4810]: I1003 07:14:59.588218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerStarted","Data":"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c"} Oct 03 07:14:59 crc kubenswrapper[4810]: I1003 07:14:59.589293 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:59 crc kubenswrapper[4810]: I1003 07:14:59.589405 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:14:59 crc kubenswrapper[4810]: I1003 07:14:59.621789 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-s8f2s" podStartSLOduration=14.150420497 podStartE2EDuration="25.621773883s" podCreationTimestamp="2025-10-03 07:14:34 +0000 UTC" firstStartedPulling="2025-10-03 07:14:43.346505663 +0000 UTC m=+1116.773756398" lastFinishedPulling="2025-10-03 07:14:54.817859049 +0000 UTC m=+1128.245109784" observedRunningTime="2025-10-03 07:14:59.620685185 +0000 UTC m=+1133.047935950" watchObservedRunningTime="2025-10-03 07:14:59.621773883 +0000 UTC m=+1133.049024618" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.154062 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk"] Oct 03 07:15:00 crc kubenswrapper[4810]: E1003 07:15:00.154792 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="init" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.154816 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="init" Oct 03 07:15:00 crc kubenswrapper[4810]: E1003 07:15:00.154844 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="dnsmasq-dns" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.154854 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="dnsmasq-dns" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.155106 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="110ce6e0-3865-43b7-bf9e-2ad228e6dfc4" containerName="dnsmasq-dns" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.155712 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.157920 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.158789 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.170066 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk"] Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.234619 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v947r\" (UniqueName: \"kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.234793 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.235020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.336865 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v947r\" (UniqueName: \"kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.336967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.337104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.338882 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume\") pod 
\"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.341574 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.374668 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v947r\" (UniqueName: \"kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r\") pod \"collect-profiles-29324595-plsxk\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.484944 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.595775 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerStarted","Data":"125c3a25a7189d260fedad09c0d614352a064e4e81c52065137ac3f845134c30"} Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.597610 4810 generic.go:334] "Generic (PLEG): container finished" podID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerID="2b43b8e3fc4f0ec24d5e44665727bb914fd15db10c5890da4c565201c94e81b3" exitCode=0 Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.597650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerDied","Data":"2b43b8e3fc4f0ec24d5e44665727bb914fd15db10c5890da4c565201c94e81b3"} Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.602008 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerStarted","Data":"e15533b44b899ab5c22bbd516df9bac595774dd7440a2dc50fd69ae2fb595cdd"} Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.603870 4810 generic.go:334] "Generic (PLEG): container finished" podID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerID="1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7" exitCode=0 Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.604020 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerDied","Data":"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7"} Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.635976 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.206925594 podStartE2EDuration="25.63595868s" podCreationTimestamp="2025-10-03 07:14:35 +0000 UTC" firstStartedPulling="2025-10-03 07:14:43.119343565 +0000 UTC m=+1116.546594300" lastFinishedPulling="2025-10-03 07:14:59.548376651 +0000 UTC m=+1132.975627386" observedRunningTime="2025-10-03 07:15:00.625395345 +0000 UTC m=+1134.052646080" watchObservedRunningTime="2025-10-03 07:15:00.63595868 +0000 UTC m=+1134.063209415" 
Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.692458 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.404668194 podStartE2EDuration="24.692438081s" podCreationTimestamp="2025-10-03 07:14:36 +0000 UTC" firstStartedPulling="2025-10-03 07:14:43.251446116 +0000 UTC m=+1116.678696851" lastFinishedPulling="2025-10-03 07:14:59.539216003 +0000 UTC m=+1132.966466738" observedRunningTime="2025-10-03 07:15:00.689465344 +0000 UTC m=+1134.116716109" watchObservedRunningTime="2025-10-03 07:15:00.692438081 +0000 UTC m=+1134.119688816" Oct 03 07:15:00 crc kubenswrapper[4810]: I1003 07:15:00.914543 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk"] Oct 03 07:15:00 crc kubenswrapper[4810]: W1003 07:15:00.921580 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0450540_6f02_4b2c_9a2b_39d93ddf303d.slice/crio-e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19 WatchSource:0}: Error finding container e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19: Status 404 returned error can't find the container with id e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19 Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.372422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.613460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerStarted","Data":"7b392a0073cffbdd3820c8d247e7f2c85594c4c28bdec1e60745b838785fde2a"} Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.615844 4810 generic.go:334] "Generic (PLEG): container finished" podID="c0450540-6f02-4b2c-9a2b-39d93ddf303d" containerID="71662c3f83c16baacce2daf019b96af84f623461be54e3c6f0821a5d7af83e88" exitCode=0 Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.615954 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" event={"ID":"c0450540-6f02-4b2c-9a2b-39d93ddf303d","Type":"ContainerDied","Data":"71662c3f83c16baacce2daf019b96af84f623461be54e3c6f0821a5d7af83e88"} Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.616214 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" event={"ID":"c0450540-6f02-4b2c-9a2b-39d93ddf303d","Type":"ContainerStarted","Data":"e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19"} Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.619106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerStarted","Data":"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9"} Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.663175 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.682330 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=23.712705561 podStartE2EDuration="35.682314576s" podCreationTimestamp="2025-10-03 07:14:26 +0000 UTC" 
firstStartedPulling="2025-10-03 07:14:42.873961923 +0000 UTC m=+1116.301212658" lastFinishedPulling="2025-10-03 07:14:54.843570938 +0000 UTC m=+1128.270821673" observedRunningTime="2025-10-03 07:15:01.644703436 +0000 UTC m=+1135.071954201" watchObservedRunningTime="2025-10-03 07:15:01.682314576 +0000 UTC m=+1135.109565301" Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.683856 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.342391445 podStartE2EDuration="35.683849845s" podCreationTimestamp="2025-10-03 07:14:26 +0000 UTC" firstStartedPulling="2025-10-03 07:14:42.453667696 +0000 UTC m=+1115.880918431" lastFinishedPulling="2025-10-03 07:14:54.795126096 +0000 UTC m=+1128.222376831" observedRunningTime="2025-10-03 07:15:01.676278439 +0000 UTC m=+1135.103529174" watchObservedRunningTime="2025-10-03 07:15:01.683849845 +0000 UTC m=+1135.111100580" Oct 03 07:15:01 crc kubenswrapper[4810]: I1003 07:15:01.713704 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.630641 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.693408 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.949511 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.972143 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:02 crc kubenswrapper[4810]: E1003 07:15:02.972557 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0450540-6f02-4b2c-9a2b-39d93ddf303d" containerName="collect-profiles" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.972580 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0450540-6f02-4b2c-9a2b-39d93ddf303d" containerName="collect-profiles" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.972798 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0450540-6f02-4b2c-9a2b-39d93ddf303d" containerName="collect-profiles" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.976131 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.978457 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.995304 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v947r\" (UniqueName: \"kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r\") pod \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.995360 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume\") pod \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.995506 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume\") pod \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\" (UID: \"c0450540-6f02-4b2c-9a2b-39d93ddf303d\") " Oct 03 07:15:02 crc kubenswrapper[4810]: I1003 07:15:02.996450 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume" (OuterVolumeSpecName: "config-volume") pod "c0450540-6f02-4b2c-9a2b-39d93ddf303d" (UID: "c0450540-6f02-4b2c-9a2b-39d93ddf303d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:02.997734 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.015145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c0450540-6f02-4b2c-9a2b-39d93ddf303d" (UID: "c0450540-6f02-4b2c-9a2b-39d93ddf303d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.015155 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r" (OuterVolumeSpecName: "kube-api-access-v947r") pod "c0450540-6f02-4b2c-9a2b-39d93ddf303d" (UID: "c0450540-6f02-4b2c-9a2b-39d93ddf303d"). InnerVolumeSpecName "kube-api-access-v947r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.020849 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.022281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.024389 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.062947 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097081 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097131 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097175 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097571 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdd5f\" (UniqueName: \"kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097649 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097799 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097822 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l87fg\" (UniqueName: \"kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.097996 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v947r\" (UniqueName: \"kubernetes.io/projected/c0450540-6f02-4b2c-9a2b-39d93ddf303d-kube-api-access-v947r\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.098008 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c0450540-6f02-4b2c-9a2b-39d93ddf303d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.098019 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c0450540-6f02-4b2c-9a2b-39d93ddf303d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199496 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdd5f\" (UniqueName: \"kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199582 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199624 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199642 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config\") pod 
\"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199656 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l87fg\" (UniqueName: \"kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199684 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199705 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199720 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.199738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.200266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.200285 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.200512 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.200811 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " 
pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.201084 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.201199 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.204103 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.204161 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.216630 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l87fg\" (UniqueName: \"kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg\") pod \"ovn-controller-metrics-hdwhb\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.217074 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdd5f\" (UniqueName: \"kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f\") pod \"dnsmasq-dns-7cfdb68775-f6k7j\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.287102 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.296682 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.375164 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.375477 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.376007 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.387717 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.389285 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.397760 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.402059 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.444767 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.504331 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.504759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.504805 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.504828 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.504928 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5257v\" (UniqueName: \"kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.606503 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.606566 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.606594 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.606662 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5257v\" (UniqueName: \"kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.606733 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.607836 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.612678 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.612681 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.617626 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.642815 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5257v\" (UniqueName: \"kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v\") pod \"dnsmasq-dns-75898fdcf9-mfw5l\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.655576 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.656086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk" event={"ID":"c0450540-6f02-4b2c-9a2b-39d93ddf303d","Type":"ContainerDied","Data":"e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19"} Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.656107 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e49b98754d3ea41d216a3f21f4eee8460ab2ba6b6846e51118b3a816ef5c6e19" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.741204 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.755448 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.763609 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.807332 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.917754 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.919450 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.922144 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.922311 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.922418 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.922534 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r4pcr" Oct 03 07:15:03 crc kubenswrapper[4810]: I1003 07:15:03.952175 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.017748 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.017821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.017871 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.017891 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.017994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.018029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.018240 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86j2q\" (UniqueName: \"kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.124716 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.125727 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.125765 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.125798 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.125819 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.126121 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86j2q\" (UniqueName: 
\"kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.126309 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.126816 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.127091 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.127740 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.129833 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.130313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.132444 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.144985 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86j2q\" (UniqueName: \"kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q\") pod \"ovn-northd-0\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.234248 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:04 crc kubenswrapper[4810]: W1003 07:15:04.235327 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f799b4b_5c3a_424f_aef2_909012dbffae.slice/crio-226af8248d0dffd20c022d4afcd990aecc66e3f92a1076256ae3ff4eef4ef334 WatchSource:0}: Error finding container 
226af8248d0dffd20c022d4afcd990aecc66e3f92a1076256ae3ff4eef4ef334: Status 404 returned error can't find the container with id 226af8248d0dffd20c022d4afcd990aecc66e3f92a1076256ae3ff4eef4ef334 Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.332842 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.669517 4810 generic.go:334] "Generic (PLEG): container finished" podID="25d58ceb-1171-451d-b471-819b327718d3" containerID="13cb8d9ece5903102ad221a2ad755ac153ba7e8a6109bb3d98810cb51f66e179" exitCode=0 Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.670090 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" event={"ID":"25d58ceb-1171-451d-b471-819b327718d3","Type":"ContainerDied","Data":"13cb8d9ece5903102ad221a2ad755ac153ba7e8a6109bb3d98810cb51f66e179"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.670138 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" event={"ID":"25d58ceb-1171-451d-b471-819b327718d3","Type":"ContainerStarted","Data":"f63fb7ebe41c82bc14b519978c8968a549b04c09b051f1b5707015f2e9689f2d"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.674573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdwhb" event={"ID":"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46","Type":"ContainerStarted","Data":"cf1efc6c98e7db501c387b01da937aadb58092bae8f0df2fc0d91ec5c743ef15"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.674641 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdwhb" event={"ID":"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46","Type":"ContainerStarted","Data":"385f4829872c9beb88e76f53ff9a4cc709241bebe66e13d57e9f76e1286944ee"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.677659 4810 generic.go:334] "Generic (PLEG): container finished" podID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerID="e174fabbfce4f659886c65261e825bd590a17ea71a91901e81f1d5065b6cb3b6" exitCode=0 Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.677769 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" event={"ID":"3f799b4b-5c3a-424f-aef2-909012dbffae","Type":"ContainerDied","Data":"e174fabbfce4f659886c65261e825bd590a17ea71a91901e81f1d5065b6cb3b6"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.677819 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" event={"ID":"3f799b4b-5c3a-424f-aef2-909012dbffae","Type":"ContainerStarted","Data":"226af8248d0dffd20c022d4afcd990aecc66e3f92a1076256ae3ff4eef4ef334"} Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.730380 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-hdwhb" podStartSLOduration=2.730351481 podStartE2EDuration="2.730351481s" podCreationTimestamp="2025-10-03 07:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:15:04.718140073 +0000 UTC m=+1138.145390808" watchObservedRunningTime="2025-10-03 07:15:04.730351481 +0000 UTC m=+1138.157602226" Oct 03 07:15:04 crc kubenswrapper[4810]: I1003 07:15:04.806376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.055440 4810 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.145238 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb\") pod \"25d58ceb-1171-451d-b471-819b327718d3\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.145426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config\") pod \"25d58ceb-1171-451d-b471-819b327718d3\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.145455 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc\") pod \"25d58ceb-1171-451d-b471-819b327718d3\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.145528 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdd5f\" (UniqueName: \"kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f\") pod \"25d58ceb-1171-451d-b471-819b327718d3\" (UID: \"25d58ceb-1171-451d-b471-819b327718d3\") " Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.153797 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f" (OuterVolumeSpecName: "kube-api-access-cdd5f") pod "25d58ceb-1171-451d-b471-819b327718d3" (UID: "25d58ceb-1171-451d-b471-819b327718d3"). InnerVolumeSpecName "kube-api-access-cdd5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.175559 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25d58ceb-1171-451d-b471-819b327718d3" (UID: "25d58ceb-1171-451d-b471-819b327718d3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.176973 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25d58ceb-1171-451d-b471-819b327718d3" (UID: "25d58ceb-1171-451d-b471-819b327718d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.182700 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config" (OuterVolumeSpecName: "config") pod "25d58ceb-1171-451d-b471-819b327718d3" (UID: "25d58ceb-1171-451d-b471-819b327718d3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.248493 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.248551 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.248570 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25d58ceb-1171-451d-b471-819b327718d3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.248586 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdd5f\" (UniqueName: \"kubernetes.io/projected/25d58ceb-1171-451d-b471-819b327718d3-kube-api-access-cdd5f\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.690929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" event={"ID":"25d58ceb-1171-451d-b471-819b327718d3","Type":"ContainerDied","Data":"f63fb7ebe41c82bc14b519978c8968a549b04c09b051f1b5707015f2e9689f2d"} Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.693028 4810 scope.go:117] "RemoveContainer" containerID="13cb8d9ece5903102ad221a2ad755ac153ba7e8a6109bb3d98810cb51f66e179" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.690966 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cfdb68775-f6k7j" Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.699614 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" event={"ID":"3f799b4b-5c3a-424f-aef2-909012dbffae","Type":"ContainerStarted","Data":"f3a01bac5fd93b8e2130488945f04b2f61b855c73ca65160e7f179a06e8d6638"} Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.714824 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerStarted","Data":"c6e3acd5a0de307723b03b1a4f782dddf205210f8198e24dfb72b544bee75f21"} Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.769741 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:05 crc kubenswrapper[4810]: I1003 07:15:05.776584 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cfdb68775-f6k7j"] Oct 03 07:15:06 crc kubenswrapper[4810]: I1003 07:15:06.737144 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:06 crc kubenswrapper[4810]: I1003 07:15:06.777964 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" podStartSLOduration=3.777929537 podStartE2EDuration="3.777929537s" podCreationTimestamp="2025-10-03 07:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:15:06.764140017 +0000 UTC m=+1140.191390772" watchObservedRunningTime="2025-10-03 07:15:06.777929537 +0000 UTC m=+1140.205180312" Oct 03 07:15:07 crc kubenswrapper[4810]: I1003 07:15:07.324251 4810 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d58ceb-1171-451d-b471-819b327718d3" path="/var/lib/kubelet/pods/25d58ceb-1171-451d-b471-819b327718d3/volumes" Oct 03 07:15:07 crc kubenswrapper[4810]: I1003 07:15:07.870921 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 07:15:07 crc kubenswrapper[4810]: I1003 07:15:07.872049 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 07:15:08 crc kubenswrapper[4810]: I1003 07:15:08.192015 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 07:15:08 crc kubenswrapper[4810]: I1003 07:15:08.192378 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 07:15:09 crc kubenswrapper[4810]: I1003 07:15:09.952480 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.042283 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.042879 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerName="dnsmasq-dns" containerID="cri-o://f3a01bac5fd93b8e2130488945f04b2f61b855c73ca65160e7f179a06e8d6638" gracePeriod=10 Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.044070 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.091862 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:15:10 crc kubenswrapper[4810]: E1003 07:15:10.097300 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d58ceb-1171-451d-b471-819b327718d3" containerName="init" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.097333 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d58ceb-1171-451d-b471-819b327718d3" containerName="init" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.097537 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d58ceb-1171-451d-b471-819b327718d3" containerName="init" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.098390 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.119675 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.245886 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.246401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6f94\" (UniqueName: \"kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.246487 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.246525 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.246663 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.348300 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6f94\" (UniqueName: \"kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.348368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.348416 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.348456 4810 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.348505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.349411 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.349424 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.349573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.349632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.367839 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6f94\" (UniqueName: \"kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94\") pod \"dnsmasq-dns-69d4d5cdc5-649t6\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.420778 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.777708 4810 generic.go:334] "Generic (PLEG): container finished" podID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerID="f3a01bac5fd93b8e2130488945f04b2f61b855c73ca65160e7f179a06e8d6638" exitCode=0 Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.777801 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" event={"ID":"3f799b4b-5c3a-424f-aef2-909012dbffae","Type":"ContainerDied","Data":"f3a01bac5fd93b8e2130488945f04b2f61b855c73ca65160e7f179a06e8d6638"} Oct 03 07:15:10 crc kubenswrapper[4810]: I1003 07:15:10.858183 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:15:10 crc kubenswrapper[4810]: W1003 07:15:10.859497 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87e1b144_75c6_4ccc_b31b_e6774e470913.slice/crio-3b36d24fd67bfb7a108d4946e63dbeb2fad5e0370a03b3fc1ce55fe0a9900bf3 WatchSource:0}: Error finding container 3b36d24fd67bfb7a108d4946e63dbeb2fad5e0370a03b3fc1ce55fe0a9900bf3: Status 404 returned error can't find the container with id 3b36d24fd67bfb7a108d4946e63dbeb2fad5e0370a03b3fc1ce55fe0a9900bf3 Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.050302 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.106598 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="galera" probeResult="failure" output=< Oct 03 07:15:11 crc kubenswrapper[4810]: wsrep_local_state_comment (Joined) differs from Synced Oct 03 07:15:11 crc kubenswrapper[4810]: > Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.141225 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.184728 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.185021 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.187887 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4b4ts" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.188070 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.188257 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.188374 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.277029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.277365 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.277682 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.277924 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbdm\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.278243 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.380251 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.381077 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.381013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock\") pod \"swift-storage-0\" (UID: 
\"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.381279 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.381575 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.381686 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift podName:3bc41979-74c8-4736-bdb0-bb6c66837ad2 nodeName:}" failed. No retries permitted until 2025-10-03 07:15:11.881668985 +0000 UTC m=+1145.308919720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift") pod "swift-storage-0" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2") : configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.381795 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.381984 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.382159 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.382167 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbdm\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.382238 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.404234 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.404303 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbdm\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " 
pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.670572 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-c27mw"] Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.672187 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.673716 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.674204 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.680322 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.683474 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c27mw"] Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.763941 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792447 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792472 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792529 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x74nb\" (UniqueName: \"kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb\") pod 
\"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.792805 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.793234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" event={"ID":"3f799b4b-5c3a-424f-aef2-909012dbffae","Type":"ContainerDied","Data":"226af8248d0dffd20c022d4afcd990aecc66e3f92a1076256ae3ff4eef4ef334"} Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.793252 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75898fdcf9-mfw5l" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.793303 4810 scope.go:117] "RemoveContainer" containerID="f3a01bac5fd93b8e2130488945f04b2f61b855c73ca65160e7f179a06e8d6638" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.795259 4810 generic.go:334] "Generic (PLEG): container finished" podID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerID="ee351051df1f3457a08e6b5a900ece536b6e9bd569a7765104faa50c35d383fe" exitCode=0 Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.795332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" event={"ID":"87e1b144-75c6-4ccc-b31b-e6774e470913","Type":"ContainerDied","Data":"ee351051df1f3457a08e6b5a900ece536b6e9bd569a7765104faa50c35d383fe"} Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.799113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" event={"ID":"87e1b144-75c6-4ccc-b31b-e6774e470913","Type":"ContainerStarted","Data":"3b36d24fd67bfb7a108d4946e63dbeb2fad5e0370a03b3fc1ce55fe0a9900bf3"} Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.823738 4810 scope.go:117] "RemoveContainer" containerID="e174fabbfce4f659886c65261e825bd590a17ea71a91901e81f1d5065b6cb3b6" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.893665 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb\") pod \"3f799b4b-5c3a-424f-aef2-909012dbffae\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.893720 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb\") pod \"3f799b4b-5c3a-424f-aef2-909012dbffae\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.893833 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc\") pod \"3f799b4b-5c3a-424f-aef2-909012dbffae\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.893954 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config\") pod 
\"3f799b4b-5c3a-424f-aef2-909012dbffae\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.893996 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5257v\" (UniqueName: \"kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v\") pod \"3f799b4b-5c3a-424f-aef2-909012dbffae\" (UID: \"3f799b4b-5c3a-424f-aef2-909012dbffae\") " Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x74nb\" (UniqueName: \"kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894414 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894450 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894551 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894572 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.894629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.894949 
4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.894968 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: E1003 07:15:11.895015 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift podName:3bc41979-74c8-4736-bdb0-bb6c66837ad2 nodeName:}" failed. No retries permitted until 2025-10-03 07:15:12.894998366 +0000 UTC m=+1146.322249111 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift") pod "swift-storage-0" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2") : configmap "swift-ring-files" not found Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.895270 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.895519 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.895573 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.898743 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v" (OuterVolumeSpecName: "kube-api-access-5257v") pod "3f799b4b-5c3a-424f-aef2-909012dbffae" (UID: "3f799b4b-5c3a-424f-aef2-909012dbffae"). InnerVolumeSpecName "kube-api-access-5257v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.900386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.902854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.903238 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.908769 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x74nb\" (UniqueName: \"kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb\") pod \"swift-ring-rebalance-c27mw\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.940351 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3f799b4b-5c3a-424f-aef2-909012dbffae" (UID: "3f799b4b-5c3a-424f-aef2-909012dbffae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.942472 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3f799b4b-5c3a-424f-aef2-909012dbffae" (UID: "3f799b4b-5c3a-424f-aef2-909012dbffae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.943926 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config" (OuterVolumeSpecName: "config") pod "3f799b4b-5c3a-424f-aef2-909012dbffae" (UID: "3f799b4b-5c3a-424f-aef2-909012dbffae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.950296 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3f799b4b-5c3a-424f-aef2-909012dbffae" (UID: "3f799b4b-5c3a-424f-aef2-909012dbffae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.996546 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.996577 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5257v\" (UniqueName: \"kubernetes.io/projected/3f799b4b-5c3a-424f-aef2-909012dbffae-kube-api-access-5257v\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.996589 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.996601 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:11 crc kubenswrapper[4810]: I1003 07:15:11.996630 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3f799b4b-5c3a-424f-aef2-909012dbffae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.061216 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.169938 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.182673 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75898fdcf9-mfw5l"] Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.647158 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c27mw"] Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.806918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c27mw" event={"ID":"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139","Type":"ContainerStarted","Data":"cdd7149a6c2a99f9da27d0ea9572e11eda8a1099340a65a21888ada80ab08e3d"} Oct 03 07:15:12 crc kubenswrapper[4810]: I1003 07:15:12.919571 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:12 crc kubenswrapper[4810]: E1003 07:15:12.919787 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 07:15:12 crc kubenswrapper[4810]: E1003 07:15:12.919809 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 07:15:12 crc kubenswrapper[4810]: E1003 07:15:12.919869 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift podName:3bc41979-74c8-4736-bdb0-bb6c66837ad2 nodeName:}" failed. No retries permitted until 2025-10-03 07:15:14.919851481 +0000 UTC m=+1148.347102226 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift") pod "swift-storage-0" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2") : configmap "swift-ring-files" not found Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.318531 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" path="/var/lib/kubelet/pods/3f799b4b-5c3a-424f-aef2-909012dbffae/volumes" Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.825292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerStarted","Data":"ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7"} Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.831996 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" event={"ID":"87e1b144-75c6-4ccc-b31b-e6774e470913","Type":"ContainerStarted","Data":"11335dcef00d17dd7b3e49d15f49dab20101994b28fccfcb5df8c98a29d836ff"} Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.832349 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.851959 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" podStartSLOduration=3.85193934 podStartE2EDuration="3.85193934s" podCreationTimestamp="2025-10-03 07:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:15:13.847976647 +0000 UTC m=+1147.275227382" watchObservedRunningTime="2025-10-03 07:15:13.85193934 +0000 UTC m=+1147.279190075" Oct 03 07:15:13 crc kubenswrapper[4810]: I1003 07:15:13.988420 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 07:15:14 crc kubenswrapper[4810]: I1003 07:15:14.043572 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="galera" probeResult="failure" output=< Oct 03 07:15:14 crc kubenswrapper[4810]: wsrep_local_state_comment (Joined) differs from Synced Oct 03 07:15:14 crc kubenswrapper[4810]: > Oct 03 07:15:14 crc kubenswrapper[4810]: I1003 07:15:14.842019 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerStarted","Data":"1fc2cb239867d8112861fe2339c64ebf43664be1f37eb5d32adf5e8d9a6244a0"} Oct 03 07:15:14 crc kubenswrapper[4810]: I1003 07:15:14.873690 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.931505675 podStartE2EDuration="11.873024628s" podCreationTimestamp="2025-10-03 07:15:03 +0000 UTC" firstStartedPulling="2025-10-03 07:15:04.833999941 +0000 UTC m=+1138.261250676" lastFinishedPulling="2025-10-03 07:15:11.775518894 +0000 UTC m=+1145.202769629" observedRunningTime="2025-10-03 07:15:14.86736238 +0000 UTC m=+1148.294613125" watchObservedRunningTime="2025-10-03 07:15:14.873024628 +0000 UTC m=+1148.300275363" Oct 03 07:15:14 crc kubenswrapper[4810]: I1003 07:15:14.973770 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:14 crc kubenswrapper[4810]: E1003 07:15:14.974026 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 07:15:14 crc kubenswrapper[4810]: E1003 07:15:14.974068 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 07:15:14 crc kubenswrapper[4810]: E1003 07:15:14.974142 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift podName:3bc41979-74c8-4736-bdb0-bb6c66837ad2 nodeName:}" failed. No retries permitted until 2025-10-03 07:15:18.974113 +0000 UTC m=+1152.401363945 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift") pod "swift-storage-0" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2") : configmap "swift-ring-files" not found Oct 03 07:15:15 crc kubenswrapper[4810]: I1003 07:15:15.855423 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 07:15:16 crc kubenswrapper[4810]: I1003 07:15:16.862882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c27mw" event={"ID":"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139","Type":"ContainerStarted","Data":"a03e4c85b6ff7858d2a856ad993d26b4ff37bf9230182e4686673134b00ae28d"} Oct 03 07:15:16 crc kubenswrapper[4810]: I1003 07:15:16.888061 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-c27mw" podStartSLOduration=2.025502083 podStartE2EDuration="5.888041765s" podCreationTimestamp="2025-10-03 07:15:11 +0000 UTC" firstStartedPulling="2025-10-03 07:15:12.652040545 +0000 UTC m=+1146.079291280" lastFinishedPulling="2025-10-03 07:15:16.514580217 +0000 UTC m=+1149.941830962" observedRunningTime="2025-10-03 07:15:16.882220123 +0000 UTC m=+1150.309470868" watchObservedRunningTime="2025-10-03 07:15:16.888041765 +0000 UTC m=+1150.315292510" Oct 03 07:15:17 crc kubenswrapper[4810]: I1003 07:15:17.939260 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.238376 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.331307 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rtx56"] Oct 03 07:15:18 crc kubenswrapper[4810]: E1003 07:15:18.331720 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerName="init" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.331742 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerName="init" Oct 03 07:15:18 crc kubenswrapper[4810]: E1003 07:15:18.331775 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerName="dnsmasq-dns" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.331784 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" 
containerName="dnsmasq-dns" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.332009 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f799b4b-5c3a-424f-aef2-909012dbffae" containerName="dnsmasq-dns" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.332739 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rtx56" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.345372 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rtx56"] Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.431437 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q8kl\" (UniqueName: \"kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl\") pod \"placement-db-create-rtx56\" (UID: \"e40b3794-f1d5-4eab-ae27-c23421053e40\") " pod="openstack/placement-db-create-rtx56" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.533407 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q8kl\" (UniqueName: \"kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl\") pod \"placement-db-create-rtx56\" (UID: \"e40b3794-f1d5-4eab-ae27-c23421053e40\") " pod="openstack/placement-db-create-rtx56" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.570142 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q8kl\" (UniqueName: \"kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl\") pod \"placement-db-create-rtx56\" (UID: \"e40b3794-f1d5-4eab-ae27-c23421053e40\") " pod="openstack/placement-db-create-rtx56" Oct 03 07:15:18 crc kubenswrapper[4810]: I1003 07:15:18.648036 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rtx56" Oct 03 07:15:19 crc kubenswrapper[4810]: I1003 07:15:19.042605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:19 crc kubenswrapper[4810]: E1003 07:15:19.042832 4810 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 07:15:19 crc kubenswrapper[4810]: E1003 07:15:19.043347 4810 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 07:15:19 crc kubenswrapper[4810]: E1003 07:15:19.043399 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift podName:3bc41979-74c8-4736-bdb0-bb6c66837ad2 nodeName:}" failed. No retries permitted until 2025-10-03 07:15:27.043383438 +0000 UTC m=+1160.470634173 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift") pod "swift-storage-0" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2") : configmap "swift-ring-files" not found Oct 03 07:15:19 crc kubenswrapper[4810]: I1003 07:15:19.116379 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rtx56"] Oct 03 07:15:19 crc kubenswrapper[4810]: W1003 07:15:19.120201 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode40b3794_f1d5_4eab_ae27_c23421053e40.slice/crio-6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147 WatchSource:0}: Error finding container 6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147: Status 404 returned error can't find the container with id 6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147 Oct 03 07:15:19 crc kubenswrapper[4810]: I1003 07:15:19.907270 4810 generic.go:334] "Generic (PLEG): container finished" podID="e40b3794-f1d5-4eab-ae27-c23421053e40" containerID="ff727ce345769006de0cc33823d09d7e241aca4424130baa33108f866e5cb72c" exitCode=0 Oct 03 07:15:19 crc kubenswrapper[4810]: I1003 07:15:19.907364 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtx56" event={"ID":"e40b3794-f1d5-4eab-ae27-c23421053e40","Type":"ContainerDied","Data":"ff727ce345769006de0cc33823d09d7e241aca4424130baa33108f866e5cb72c"} Oct 03 07:15:19 crc kubenswrapper[4810]: I1003 07:15:19.907718 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtx56" event={"ID":"e40b3794-f1d5-4eab-ae27-c23421053e40","Type":"ContainerStarted","Data":"6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147"} Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.422118 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.501739 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.502127 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cb7995759-df88j" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="dnsmasq-dns" containerID="cri-o://c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9" gracePeriod=10 Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.916167 4810 generic.go:334] "Generic (PLEG): container finished" podID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerID="c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9" exitCode=0 Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.916563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-df88j" event={"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerDied","Data":"c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9"} Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.916608 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cb7995759-df88j" event={"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerDied","Data":"83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e"} Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.916622 4810 pod_container_deletor.go:80] "Container not found in pod's containers" 
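
The "SyncLoop (PLEG): event for pod" entries carry their payload as a small JSON object — ID is the pod UID, Type the lifecycle transition, Data the container or sandbox ID — which makes them easy to pull apart when post-processing a capture like this. A stdlib sketch that decodes one of the payloads quoted above:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Shape of the event=... payload printed by the PLEG entries above.
type plegEvent struct {
	ID   string // pod UID
	Type string // e.g. ContainerStarted, ContainerDied
	Data string // container or sandbox ID
}

func main() {
	// Payload copied from the dnsmasq-dns-5cb7995759-df88j ContainerDied entry.
	raw := `{"ID":"153dac9a-2e18-4482-b260-c5e5e90a578f","Type":"ContainerDied","Data":"c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9"}`

	var ev plegEvent
	if err := json.Unmarshal([]byte(raw), &ev); err != nil {
		panic(err)
	}
	fmt.Printf("pod %s: %s (%s...)\n", ev.ID, ev.Type, ev.Data[:12])
}
```
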
containerID="83a3aaee47d0b34c70a992d764dd3877fa1b9628c23004072a4035845cb6445e" Oct 03 07:15:20 crc kubenswrapper[4810]: I1003 07:15:20.955724 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.083562 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmb7t\" (UniqueName: \"kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t\") pod \"153dac9a-2e18-4482-b260-c5e5e90a578f\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.083717 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc\") pod \"153dac9a-2e18-4482-b260-c5e5e90a578f\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.083782 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config\") pod \"153dac9a-2e18-4482-b260-c5e5e90a578f\" (UID: \"153dac9a-2e18-4482-b260-c5e5e90a578f\") " Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.104329 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t" (OuterVolumeSpecName: "kube-api-access-pmb7t") pod "153dac9a-2e18-4482-b260-c5e5e90a578f" (UID: "153dac9a-2e18-4482-b260-c5e5e90a578f"). InnerVolumeSpecName "kube-api-access-pmb7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.141248 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "153dac9a-2e18-4482-b260-c5e5e90a578f" (UID: "153dac9a-2e18-4482-b260-c5e5e90a578f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.152016 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config" (OuterVolumeSpecName: "config") pod "153dac9a-2e18-4482-b260-c5e5e90a578f" (UID: "153dac9a-2e18-4482-b260-c5e5e90a578f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.191829 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.191855 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmb7t\" (UniqueName: \"kubernetes.io/projected/153dac9a-2e18-4482-b260-c5e5e90a578f-kube-api-access-pmb7t\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.191883 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/153dac9a-2e18-4482-b260-c5e5e90a578f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.197736 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rtx56" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.293353 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q8kl\" (UniqueName: \"kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl\") pod \"e40b3794-f1d5-4eab-ae27-c23421053e40\" (UID: \"e40b3794-f1d5-4eab-ae27-c23421053e40\") " Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.296380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl" (OuterVolumeSpecName: "kube-api-access-8q8kl") pod "e40b3794-f1d5-4eab-ae27-c23421053e40" (UID: "e40b3794-f1d5-4eab-ae27-c23421053e40"). InnerVolumeSpecName "kube-api-access-8q8kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.396582 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q8kl\" (UniqueName: \"kubernetes.io/projected/e40b3794-f1d5-4eab-ae27-c23421053e40-kube-api-access-8q8kl\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.926228 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cb7995759-df88j" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.926232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rtx56" event={"ID":"e40b3794-f1d5-4eab-ae27-c23421053e40","Type":"ContainerDied","Data":"6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147"} Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.926798 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c569f82ad418ed3096986740f0c1a7b91096582c6487d46791027e07da18147" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.926250 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rtx56" Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.950511 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:15:21 crc kubenswrapper[4810]: I1003 07:15:21.958921 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cb7995759-df88j"] Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.311798 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" path="/var/lib/kubelet/pods/153dac9a-2e18-4482-b260-c5e5e90a578f/volumes" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.481968 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hg4bz"] Oct 03 07:15:23 crc kubenswrapper[4810]: E1003 07:15:23.482435 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="dnsmasq-dns" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.482462 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="dnsmasq-dns" Oct 03 07:15:23 crc kubenswrapper[4810]: E1003 07:15:23.482523 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e40b3794-f1d5-4eab-ae27-c23421053e40" containerName="mariadb-database-create" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.482537 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e40b3794-f1d5-4eab-ae27-c23421053e40" containerName="mariadb-database-create" Oct 03 07:15:23 crc kubenswrapper[4810]: E1003 07:15:23.482556 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="init" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.482569 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="init" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.482866 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="153dac9a-2e18-4482-b260-c5e5e90a578f" containerName="dnsmasq-dns" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.482884 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e40b3794-f1d5-4eab-ae27-c23421053e40" containerName="mariadb-database-create" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.483878 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.495372 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hg4bz"] Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.533272 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlc6b\" (UniqueName: \"kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b\") pod \"glance-db-create-hg4bz\" (UID: \"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab\") " pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.634961 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlc6b\" (UniqueName: \"kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b\") pod \"glance-db-create-hg4bz\" (UID: \"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab\") " pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.658875 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlc6b\" (UniqueName: \"kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b\") pod \"glance-db-create-hg4bz\" (UID: \"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab\") " pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.804985 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.961743 4810 generic.go:334] "Generic (PLEG): container finished" podID="eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" containerID="a03e4c85b6ff7858d2a856ad993d26b4ff37bf9230182e4686673134b00ae28d" exitCode=0 Oct 03 07:15:23 crc kubenswrapper[4810]: I1003 07:15:23.961881 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c27mw" event={"ID":"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139","Type":"ContainerDied","Data":"a03e4c85b6ff7858d2a856ad993d26b4ff37bf9230182e4686673134b00ae28d"} Oct 03 07:15:24 crc kubenswrapper[4810]: I1003 07:15:24.289964 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hg4bz"] Oct 03 07:15:24 crc kubenswrapper[4810]: W1003 07:15:24.298629 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd0e820_b591_4d7d_94d4_ccf6ae82ddab.slice/crio-202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c WatchSource:0}: Error finding container 202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c: Status 404 returned error can't find the container with id 202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c Oct 03 07:15:24 crc kubenswrapper[4810]: I1003 07:15:24.386058 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 07:15:24 crc kubenswrapper[4810]: I1003 07:15:24.970753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hg4bz" event={"ID":"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab","Type":"ContainerStarted","Data":"202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c"} Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.350570 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370132 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370288 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370376 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370655 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370745 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.370812 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x74nb\" (UniqueName: \"kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb\") pod \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\" (UID: \"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139\") " Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.371282 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.371561 4810 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.371664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.376768 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb" (OuterVolumeSpecName: "kube-api-access-x74nb") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "kube-api-access-x74nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.391727 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.400713 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.402690 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.414389 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts" (OuterVolumeSpecName: "scripts") pod "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" (UID: "eb3dbca9-c4b9-4f3a-b47a-03a50b9af139"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473597 4810 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473637 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473655 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473666 4810 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473679 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x74nb\" (UniqueName: \"kubernetes.io/projected/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-kube-api-access-x74nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.473692 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.983233 4810 generic.go:334] "Generic (PLEG): container finished" podID="dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" containerID="a912efb34fb2f84872b11c81fd5c95edc0960044fe3ae8e5a4a1b417cd38a487" exitCode=0 Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.983301 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hg4bz" event={"ID":"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab","Type":"ContainerDied","Data":"a912efb34fb2f84872b11c81fd5c95edc0960044fe3ae8e5a4a1b417cd38a487"} Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.985491 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c27mw" Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.985401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c27mw" event={"ID":"eb3dbca9-c4b9-4f3a-b47a-03a50b9af139","Type":"ContainerDied","Data":"cdd7149a6c2a99f9da27d0ea9572e11eda8a1099340a65a21888ada80ab08e3d"} Oct 03 07:15:25 crc kubenswrapper[4810]: I1003 07:15:25.985959 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdd7149a6c2a99f9da27d0ea9572e11eda8a1099340a65a21888ada80ab08e3d" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.105517 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.115801 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"swift-storage-0\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " pod="openstack/swift-storage-0" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.121874 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.312801 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.409716 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlc6b\" (UniqueName: \"kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b\") pod \"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab\" (UID: \"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab\") " Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.414545 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b" (OuterVolumeSpecName: "kube-api-access-rlc6b") pod "dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" (UID: "dcd0e820-b591-4d7d-94d4-ccf6ae82ddab"). InnerVolumeSpecName "kube-api-access-rlc6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.512450 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlc6b\" (UniqueName: \"kubernetes.io/projected/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab-kube-api-access-rlc6b\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.635709 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:15:27 crc kubenswrapper[4810]: W1003 07:15:27.637820 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc41979_74c8_4736_bdb0_bb6c66837ad2.slice/crio-3c7bab0219b1d7efab0e67757983842a7cfffd5a6a394d9479ccce8a61dd6084 WatchSource:0}: Error finding container 3c7bab0219b1d7efab0e67757983842a7cfffd5a6a394d9479ccce8a61dd6084: Status 404 returned error can't find the container with id 3c7bab0219b1d7efab0e67757983842a7cfffd5a6a394d9479ccce8a61dd6084 Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.909984 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mnstl"] Oct 03 07:15:27 crc kubenswrapper[4810]: E1003 07:15:27.910546 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" containerName="mariadb-database-create" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.910645 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" containerName="mariadb-database-create" Oct 03 07:15:27 crc kubenswrapper[4810]: E1003 07:15:27.910751 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" containerName="swift-ring-rebalance" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.912301 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" containerName="swift-ring-rebalance" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.912778 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" containerName="mariadb-database-create" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.912869 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" containerName="swift-ring-rebalance" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.913621 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:27 crc kubenswrapper[4810]: I1003 07:15:27.928215 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mnstl"] Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.000199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hg4bz" event={"ID":"dcd0e820-b591-4d7d-94d4-ccf6ae82ddab","Type":"ContainerDied","Data":"202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c"} Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.000260 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="202bee013c9f8a02ced4f76e51f60034d03213a234b83f8ab95161a53d29759c" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.000220 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hg4bz" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.001371 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"3c7bab0219b1d7efab0e67757983842a7cfffd5a6a394d9479ccce8a61dd6084"} Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.022037 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hs7\" (UniqueName: \"kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7\") pod \"keystone-db-create-mnstl\" (UID: \"48838182-7963-4d2a-a108-13e9a5c7a111\") " pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.123335 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hs7\" (UniqueName: \"kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7\") pod \"keystone-db-create-mnstl\" (UID: \"48838182-7963-4d2a-a108-13e9a5c7a111\") " pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.151229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hs7\" (UniqueName: \"kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7\") pod \"keystone-db-create-mnstl\" (UID: \"48838182-7963-4d2a-a108-13e9a5c7a111\") " pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.239077 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.419221 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c289-account-create-pwwl5"] Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.420828 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.423127 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.431323 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c289-account-create-pwwl5"] Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.539780 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcmb6\" (UniqueName: \"kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6\") pod \"placement-c289-account-create-pwwl5\" (UID: \"10621e58-9aee-4bfc-a0ad-6e4857d48a41\") " pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.641752 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcmb6\" (UniqueName: \"kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6\") pod \"placement-c289-account-create-pwwl5\" (UID: \"10621e58-9aee-4bfc-a0ad-6e4857d48a41\") " pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.657009 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mnstl"] Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.660692 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcmb6\" (UniqueName: \"kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6\") pod \"placement-c289-account-create-pwwl5\" (UID: \"10621e58-9aee-4bfc-a0ad-6e4857d48a41\") " pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:28 crc kubenswrapper[4810]: I1003 07:15:28.747096 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.018189 4810 generic.go:334] "Generic (PLEG): container finished" podID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerID="b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0" exitCode=0 Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.018328 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerDied","Data":"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0"} Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.021782 4810 generic.go:334] "Generic (PLEG): container finished" podID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerID="6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3" exitCode=0 Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.021835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerDied","Data":"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3"} Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.752990 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7rltw" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" probeResult="failure" output=< Oct 03 07:15:29 crc kubenswrapper[4810]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 07:15:29 crc kubenswrapper[4810]: > Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.782998 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:15:29 crc kubenswrapper[4810]: I1003 07:15:29.788321 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.022491 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7rltw-config-8b5nc"] Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.023755 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.025459 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.038793 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7rltw-config-8b5nc"] Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.050361 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerStarted","Data":"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f"} Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.050860 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.057150 4810 generic.go:334] "Generic (PLEG): container finished" podID="48838182-7963-4d2a-a108-13e9a5c7a111" containerID="efbce464b43ea569647c2f211d82d262654767de06209c1d333f8956f0cfde74" exitCode=0 Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.057234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnstl" event={"ID":"48838182-7963-4d2a-a108-13e9a5c7a111","Type":"ContainerDied","Data":"efbce464b43ea569647c2f211d82d262654767de06209c1d333f8956f0cfde74"} Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.057269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnstl" event={"ID":"48838182-7963-4d2a-a108-13e9a5c7a111","Type":"ContainerStarted","Data":"768f6535b4594194866d45cd52c72874746d490bddf59d318166dde731ab8e15"} Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.066585 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerStarted","Data":"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec"} Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.067473 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.074170 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"bccbd5c0aebf7a8d7a2c209dbc4d28096af4feec63ee3acb6fb99edbed1f7128"} Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.098332 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.989113417 podStartE2EDuration="1m7.098314427s" podCreationTimestamp="2025-10-03 07:14:23 +0000 UTC" firstStartedPulling="2025-10-03 07:14:42.699640803 +0000 UTC m=+1116.126891538" lastFinishedPulling="2025-10-03 07:14:54.808841813 +0000 UTC m=+1128.236092548" observedRunningTime="2025-10-03 07:15:30.075206975 +0000 UTC m=+1163.502457710" watchObservedRunningTime="2025-10-03 07:15:30.098314427 +0000 UTC m=+1163.525565162" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.126119 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.272260945 podStartE2EDuration="1m7.126101941s" podCreationTimestamp="2025-10-03 07:14:23 +0000 UTC" firstStartedPulling="2025-10-03 07:14:28.872329658 +0000 UTC m=+1102.299580393" 
lastFinishedPulling="2025-10-03 07:14:47.726170654 +0000 UTC m=+1121.153421389" observedRunningTime="2025-10-03 07:15:30.120439523 +0000 UTC m=+1163.547690278" watchObservedRunningTime="2025-10-03 07:15:30.126101941 +0000 UTC m=+1163.553352676" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.147536 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c289-account-create-pwwl5"] Oct 03 07:15:30 crc kubenswrapper[4810]: W1003 07:15:30.160242 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10621e58_9aee_4bfc_a0ad_6e4857d48a41.slice/crio-0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7 WatchSource:0}: Error finding container 0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7: Status 404 returned error can't find the container with id 0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7 Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.169918 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.170049 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.170115 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.170130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.170170 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6sm\" (UniqueName: \"kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.170238 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271363 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271420 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271470 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6sm\" (UniqueName: \"kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271559 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271794 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271794 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.271814 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.272396 4810 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.273807 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.303020 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6sm\" (UniqueName: \"kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm\") pod \"ovn-controller-7rltw-config-8b5nc\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.385579 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:30 crc kubenswrapper[4810]: I1003 07:15:30.852824 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7rltw-config-8b5nc"] Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.086931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw-config-8b5nc" event={"ID":"dc07ce3e-3d21-4be1-83fa-f0bad78299a9","Type":"ContainerStarted","Data":"5b7b962d4e8c6ecf24b0a12a3a4fa32feb6af6d047ab10fb72675198ba6c703f"} Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.090044 4810 generic.go:334] "Generic (PLEG): container finished" podID="10621e58-9aee-4bfc-a0ad-6e4857d48a41" containerID="da7ab208b7a7f6acd54276737cbd12b568557be50e4c4853ebfbdce1cc820332" exitCode=0 Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.090088 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c289-account-create-pwwl5" event={"ID":"10621e58-9aee-4bfc-a0ad-6e4857d48a41","Type":"ContainerDied","Data":"da7ab208b7a7f6acd54276737cbd12b568557be50e4c4853ebfbdce1cc820332"} Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.090389 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c289-account-create-pwwl5" event={"ID":"10621e58-9aee-4bfc-a0ad-6e4857d48a41","Type":"ContainerStarted","Data":"0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7"} Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.094125 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"9974961e6abf7febdcf8e34a0c97b87742b29d09441f5766a4da92668b493130"} Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.094354 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"0f9c3077ad8cecdd534fca11fbffb1bf485c180ae8d7129b25caffd483249975"} Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.094474 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"a6d2ec4008f34f574e9c7b7b8a4964291b10701cd744120f6b95298162e1741c"} Oct 03 07:15:31 
crc kubenswrapper[4810]: I1003 07:15:31.622617 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.694266 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5hs7\" (UniqueName: \"kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7\") pod \"48838182-7963-4d2a-a108-13e9a5c7a111\" (UID: \"48838182-7963-4d2a-a108-13e9a5c7a111\") " Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.700416 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7" (OuterVolumeSpecName: "kube-api-access-n5hs7") pod "48838182-7963-4d2a-a108-13e9a5c7a111" (UID: "48838182-7963-4d2a-a108-13e9a5c7a111"). InnerVolumeSpecName "kube-api-access-n5hs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:31 crc kubenswrapper[4810]: I1003 07:15:31.799724 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5hs7\" (UniqueName: \"kubernetes.io/projected/48838182-7963-4d2a-a108-13e9a5c7a111-kube-api-access-n5hs7\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.104620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mnstl" event={"ID":"48838182-7963-4d2a-a108-13e9a5c7a111","Type":"ContainerDied","Data":"768f6535b4594194866d45cd52c72874746d490bddf59d318166dde731ab8e15"} Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.105042 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="768f6535b4594194866d45cd52c72874746d490bddf59d318166dde731ab8e15" Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.105006 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mnstl" Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.109044 4810 generic.go:334] "Generic (PLEG): container finished" podID="dc07ce3e-3d21-4be1-83fa-f0bad78299a9" containerID="3d014cea719ad2754563bf708ca5edd4cb57cd05cdc5d9f1b0fe47be4fe251f0" exitCode=0 Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.109116 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw-config-8b5nc" event={"ID":"dc07ce3e-3d21-4be1-83fa-f0bad78299a9","Type":"ContainerDied","Data":"3d014cea719ad2754563bf708ca5edd4cb57cd05cdc5d9f1b0fe47be4fe251f0"} Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.113379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"45e3ddc52af834f603045aa999d0eb049c646473758731b30f7e8e2af55782fa"} Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.113479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"2b0ce3947967fa96d846a63bd3163c855b442489cfeea6683ecb041c0f9cf8f1"} Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.426822 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.518286 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcmb6\" (UniqueName: \"kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6\") pod \"10621e58-9aee-4bfc-a0ad-6e4857d48a41\" (UID: \"10621e58-9aee-4bfc-a0ad-6e4857d48a41\") " Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.539249 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6" (OuterVolumeSpecName: "kube-api-access-qcmb6") pod "10621e58-9aee-4bfc-a0ad-6e4857d48a41" (UID: "10621e58-9aee-4bfc-a0ad-6e4857d48a41"). InnerVolumeSpecName "kube-api-access-qcmb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:32 crc kubenswrapper[4810]: I1003 07:15:32.620616 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcmb6\" (UniqueName: \"kubernetes.io/projected/10621e58-9aee-4bfc-a0ad-6e4857d48a41-kube-api-access-qcmb6\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.129983 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c289-account-create-pwwl5" event={"ID":"10621e58-9aee-4bfc-a0ad-6e4857d48a41","Type":"ContainerDied","Data":"0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7"} Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.130554 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a7a9f881f5b64f8d1316d336dade8d63d196de02de6b4380798ccb86f8a36b7" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.130056 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c289-account-create-pwwl5" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.137066 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"46fe542f82adb9dd0f09edd022b0f4c932309d5e5f689981a10664ea462e5666"} Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.137113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"260956333a59a33c2b6828c4e5152fe36a01a3f751272837729f275e4afc7b35"} Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.544302 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2b97-account-create-cbnjn"] Oct 03 07:15:33 crc kubenswrapper[4810]: E1003 07:15:33.544682 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48838182-7963-4d2a-a108-13e9a5c7a111" containerName="mariadb-database-create" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.544703 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="48838182-7963-4d2a-a108-13e9a5c7a111" containerName="mariadb-database-create" Oct 03 07:15:33 crc kubenswrapper[4810]: E1003 07:15:33.544729 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10621e58-9aee-4bfc-a0ad-6e4857d48a41" containerName="mariadb-account-create" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.544738 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="10621e58-9aee-4bfc-a0ad-6e4857d48a41" containerName="mariadb-account-create" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.544980 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="48838182-7963-4d2a-a108-13e9a5c7a111" containerName="mariadb-database-create" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.545015 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="10621e58-9aee-4bfc-a0ad-6e4857d48a41" containerName="mariadb-account-create" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.545629 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.549203 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.553541 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2b97-account-create-cbnjn"] Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.636716 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp7dv\" (UniqueName: \"kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv\") pod \"glance-2b97-account-create-cbnjn\" (UID: \"0edd30f8-0847-4961-8f40-399967851ebe\") " pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.708518 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.742117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp7dv\" (UniqueName: \"kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv\") pod \"glance-2b97-account-create-cbnjn\" (UID: \"0edd30f8-0847-4961-8f40-399967851ebe\") " pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.766780 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp7dv\" (UniqueName: \"kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv\") pod \"glance-2b97-account-create-cbnjn\" (UID: \"0edd30f8-0847-4961-8f40-399967851ebe\") " pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843113 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843423 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843475 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc6sm\" (UniqueName: \"kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843510 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843502 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run" (OuterVolumeSpecName: "var-run") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843541 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843567 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.843654 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn\") pod \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\" (UID: \"dc07ce3e-3d21-4be1-83fa-f0bad78299a9\") " Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.844043 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.844423 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.844711 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts" (OuterVolumeSpecName: "scripts") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.845145 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.845164 4810 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.845174 4810 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.845186 4810 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.845195 4810 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.847646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm" (OuterVolumeSpecName: "kube-api-access-kc6sm") pod "dc07ce3e-3d21-4be1-83fa-f0bad78299a9" (UID: "dc07ce3e-3d21-4be1-83fa-f0bad78299a9"). InnerVolumeSpecName "kube-api-access-kc6sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.865959 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:33 crc kubenswrapper[4810]: I1003 07:15:33.947132 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc6sm\" (UniqueName: \"kubernetes.io/projected/dc07ce3e-3d21-4be1-83fa-f0bad78299a9-kube-api-access-kc6sm\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.147059 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw-config-8b5nc" event={"ID":"dc07ce3e-3d21-4be1-83fa-f0bad78299a9","Type":"ContainerDied","Data":"5b7b962d4e8c6ecf24b0a12a3a4fa32feb6af6d047ab10fb72675198ba6c703f"} Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.147102 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b7b962d4e8c6ecf24b0a12a3a4fa32feb6af6d047ab10fb72675198ba6c703f" Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.147153 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7rltw-config-8b5nc" Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.319249 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2b97-account-create-cbnjn"] Oct 03 07:15:34 crc kubenswrapper[4810]: W1003 07:15:34.329950 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edd30f8_0847_4961_8f40_399967851ebe.slice/crio-8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512 WatchSource:0}: Error finding container 8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512: Status 404 returned error can't find the container with id 8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512 Oct 03 07:15:34 crc kubenswrapper[4810]: E1003 07:15:34.695412 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edd30f8_0847_4961_8f40_399967851ebe.slice/crio-conmon-b5e2a6f32a511cfe5c2bcb58d96e9477a878fec9efb648f6d20cbea45fa1123b.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.771663 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7rltw" Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.853120 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7rltw-config-8b5nc"] Oct 03 07:15:34 crc kubenswrapper[4810]: I1003 07:15:34.879403 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7rltw-config-8b5nc"] Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.160096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"889f53c414198ba8568ddac1401c1afddd20f9d138e22cef33957e8a0c27a046"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.160141 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"895aa7e546ccbeb235b2c62051d1f3cb4581f5dca2fdbb37fa958319aba74571"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 
07:15:35.160163 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"888e7fe71210eba8464268fac89d6874d77e318e83fa8a722297b8a5dbe5627c"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.160171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"ac5b608d8a7aa92bd915544af8c042f3b63ff7151a33ff4dbb17a4a861e889b3"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.160181 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"2fea3519604050a7d09779c938e0622ac0f41e5a7bf1dd301b0b7a60c049d624"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.160190 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"ccf54b0b656baac5b5fa788f8029adbf50760366a62fbd65bb43a3456b3066a8"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.161820 4810 generic.go:334] "Generic (PLEG): container finished" podID="0edd30f8-0847-4961-8f40-399967851ebe" containerID="b5e2a6f32a511cfe5c2bcb58d96e9477a878fec9efb648f6d20cbea45fa1123b" exitCode=0 Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.161885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2b97-account-create-cbnjn" event={"ID":"0edd30f8-0847-4961-8f40-399967851ebe","Type":"ContainerDied","Data":"b5e2a6f32a511cfe5c2bcb58d96e9477a878fec9efb648f6d20cbea45fa1123b"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.161963 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2b97-account-create-cbnjn" event={"ID":"0edd30f8-0847-4961-8f40-399967851ebe","Type":"ContainerStarted","Data":"8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512"} Oct 03 07:15:35 crc kubenswrapper[4810]: I1003 07:15:35.320957 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc07ce3e-3d21-4be1-83fa-f0bad78299a9" path="/var/lib/kubelet/pods/dc07ce3e-3d21-4be1-83fa-f0bad78299a9/volumes" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.175201 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerStarted","Data":"1dc52a1d91d58d42a133fd8634f92be4368fdda5c3e244244cb985f1a0e500b4"} Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.235948 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.185044747 podStartE2EDuration="26.235931751s" podCreationTimestamp="2025-10-03 07:15:10 +0000 UTC" firstStartedPulling="2025-10-03 07:15:27.640253619 +0000 UTC m=+1161.067504354" lastFinishedPulling="2025-10-03 07:15:33.691140623 +0000 UTC m=+1167.118391358" observedRunningTime="2025-10-03 07:15:36.234041702 +0000 UTC m=+1169.661292437" watchObservedRunningTime="2025-10-03 07:15:36.235931751 +0000 UTC m=+1169.663182486" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.499406 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:15:36 crc kubenswrapper[4810]: E1003 07:15:36.500520 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc07ce3e-3d21-4be1-83fa-f0bad78299a9" 
containerName="ovn-config" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.500545 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc07ce3e-3d21-4be1-83fa-f0bad78299a9" containerName="ovn-config" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.500815 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc07ce3e-3d21-4be1-83fa-f0bad78299a9" containerName="ovn-config" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.502064 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.505279 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.554007 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.557482 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.592391 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl7kl\" (UniqueName: \"kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.592616 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.592790 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.592942 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.592997 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.593242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " 
pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694388 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp7dv\" (UniqueName: \"kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv\") pod \"0edd30f8-0847-4961-8f40-399967851ebe\" (UID: \"0edd30f8-0847-4961-8f40-399967851ebe\") " Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694632 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694708 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl7kl\" (UniqueName: \"kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694745 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694774 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694800 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.694822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.695595 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.695713 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 
07:15:36.695905 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.696117 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.696433 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.712816 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv" (OuterVolumeSpecName: "kube-api-access-vp7dv") pod "0edd30f8-0847-4961-8f40-399967851ebe" (UID: "0edd30f8-0847-4961-8f40-399967851ebe"). InnerVolumeSpecName "kube-api-access-vp7dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.717223 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl7kl\" (UniqueName: \"kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl\") pod \"dnsmasq-dns-7987bc7949-2cbfl\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.797017 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp7dv\" (UniqueName: \"kubernetes.io/projected/0edd30f8-0847-4961-8f40-399967851ebe-kube-api-access-vp7dv\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:36 crc kubenswrapper[4810]: I1003 07:15:36.872116 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.184232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2b97-account-create-cbnjn" event={"ID":"0edd30f8-0847-4961-8f40-399967851ebe","Type":"ContainerDied","Data":"8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512"} Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.184611 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b355a3031c9533cd6c87ca25beedd77f113b31ff2024290ca8a6de31ac52512" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.184282 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2b97-account-create-cbnjn" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.344311 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:15:37 crc kubenswrapper[4810]: W1003 07:15:37.347379 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89d07cf8_f28d_4137_8eb6_cdf20040b346.slice/crio-06dcd3221d4c0ea357b6aeeb5175ccd79df323e9b6946da4558a5e869c7ca52b WatchSource:0}: Error finding container 06dcd3221d4c0ea357b6aeeb5175ccd79df323e9b6946da4558a5e869c7ca52b: Status 404 returned error can't find the container with id 06dcd3221d4c0ea357b6aeeb5175ccd79df323e9b6946da4558a5e869c7ca52b Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.949864 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f3ef-account-create-mwtr7"] Oct 03 07:15:37 crc kubenswrapper[4810]: E1003 07:15:37.950603 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edd30f8-0847-4961-8f40-399967851ebe" containerName="mariadb-account-create" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.950622 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edd30f8-0847-4961-8f40-399967851ebe" containerName="mariadb-account-create" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.951556 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edd30f8-0847-4961-8f40-399967851ebe" containerName="mariadb-account-create" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.952157 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.957958 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 07:15:37 crc kubenswrapper[4810]: I1003 07:15:37.965800 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f3ef-account-create-mwtr7"] Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.018941 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp98w\" (UniqueName: \"kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w\") pod \"keystone-f3ef-account-create-mwtr7\" (UID: \"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6\") " pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.121078 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp98w\" (UniqueName: \"kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w\") pod \"keystone-f3ef-account-create-mwtr7\" (UID: \"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6\") " pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.144276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp98w\" (UniqueName: \"kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w\") pod \"keystone-f3ef-account-create-mwtr7\" (UID: \"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6\") " pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.199168 4810 generic.go:334] "Generic (PLEG): container finished" podID="89d07cf8-f28d-4137-8eb6-cdf20040b346" 
containerID="c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776" exitCode=0 Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.199247 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" event={"ID":"89d07cf8-f28d-4137-8eb6-cdf20040b346","Type":"ContainerDied","Data":"c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776"} Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.199297 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" event={"ID":"89d07cf8-f28d-4137-8eb6-cdf20040b346","Type":"ContainerStarted","Data":"06dcd3221d4c0ea357b6aeeb5175ccd79df323e9b6946da4558a5e869c7ca52b"} Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.278630 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.716273 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f3ef-account-create-mwtr7"] Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.776419 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xk8xp"] Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.778200 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.780807 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ktz64" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.782146 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.784687 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xk8xp"] Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.835169 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.835540 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.835721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.835771 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mrc\" (UniqueName: \"kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.937380 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.937445 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mrc\" (UniqueName: \"kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.937473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.937494 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.943554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.946345 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.946805 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:38 crc kubenswrapper[4810]: I1003 07:15:38.959566 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mrc\" (UniqueName: \"kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc\") pod \"glance-db-sync-xk8xp\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.193271 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xk8xp" Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.233359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" event={"ID":"89d07cf8-f28d-4137-8eb6-cdf20040b346","Type":"ContainerStarted","Data":"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e"} Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.234518 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.256187 4810 generic.go:334] "Generic (PLEG): container finished" podID="9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" containerID="ba8c8fe21e1637c7d147097344118435d1e9acd34eb2f690dcbaedd538f93903" exitCode=0 Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.256243 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f3ef-account-create-mwtr7" event={"ID":"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6","Type":"ContainerDied","Data":"ba8c8fe21e1637c7d147097344118435d1e9acd34eb2f690dcbaedd538f93903"} Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.256276 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f3ef-account-create-mwtr7" event={"ID":"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6","Type":"ContainerStarted","Data":"76c142245a0be082321c4d7c8c7e2b51dbc5e57c24a5a9aee73b94224c56e552"} Oct 03 07:15:39 crc kubenswrapper[4810]: I1003 07:15:39.265994 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" podStartSLOduration=3.2659790969999998 podStartE2EDuration="3.265979097s" podCreationTimestamp="2025-10-03 07:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:15:39.263346069 +0000 UTC m=+1172.690596804" watchObservedRunningTime="2025-10-03 07:15:39.265979097 +0000 UTC m=+1172.693229832" Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:39.816330 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xk8xp"] Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:40.266145 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xk8xp" event={"ID":"adfe75ef-eb52-4d1d-8164-716ea23ead8d","Type":"ContainerStarted","Data":"5e1d96a0ee045a06924b111fd595352b979eb7819ac78ec11fd496b3133e3388"} Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:40.570169 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:40.668604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp98w\" (UniqueName: \"kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w\") pod \"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6\" (UID: \"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6\") " Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:40.675884 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w" (OuterVolumeSpecName: "kube-api-access-sp98w") pod "9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" (UID: "9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6"). InnerVolumeSpecName "kube-api-access-sp98w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:40 crc kubenswrapper[4810]: I1003 07:15:40.771162 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp98w\" (UniqueName: \"kubernetes.io/projected/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6-kube-api-access-sp98w\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:41 crc kubenswrapper[4810]: I1003 07:15:41.274631 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f3ef-account-create-mwtr7" Oct 03 07:15:41 crc kubenswrapper[4810]: I1003 07:15:41.274628 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f3ef-account-create-mwtr7" event={"ID":"9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6","Type":"ContainerDied","Data":"76c142245a0be082321c4d7c8c7e2b51dbc5e57c24a5a9aee73b94224c56e552"} Oct 03 07:15:41 crc kubenswrapper[4810]: I1003 07:15:41.274849 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c142245a0be082321c4d7c8c7e2b51dbc5e57c24a5a9aee73b94224c56e552" Oct 03 07:15:44 crc kubenswrapper[4810]: I1003 07:15:44.966122 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:15:45 crc kubenswrapper[4810]: I1003 07:15:45.281853 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.707844 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kj8vc"] Oct 03 07:15:46 crc kubenswrapper[4810]: E1003 07:15:46.708404 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" containerName="mariadb-account-create" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.708416 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" containerName="mariadb-account-create" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.708560 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" containerName="mariadb-account-create" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.709200 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kj8vc" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.728761 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kj8vc"] Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.812040 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-94grw"] Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.813246 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-94grw" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.821186 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-94grw"] Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.862963 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrwv7\" (UniqueName: \"kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7\") pod \"cinder-db-create-kj8vc\" (UID: \"81605a74-99a2-4c21-8bac-8c4b64c2164f\") " pod="openstack/cinder-db-create-kj8vc" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.863090 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7cbg\" (UniqueName: \"kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg\") pod \"barbican-db-create-94grw\" (UID: \"4620f70b-3da3-4486-a8d7-1c784246d780\") " pod="openstack/barbican-db-create-94grw" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.875104 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.937229 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.942801 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" containerID="cri-o://11335dcef00d17dd7b3e49d15f49dab20101994b28fccfcb5df8c98a29d836ff" gracePeriod=10 Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.965161 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrwv7\" (UniqueName: \"kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7\") pod \"cinder-db-create-kj8vc\" (UID: \"81605a74-99a2-4c21-8bac-8c4b64c2164f\") " pod="openstack/cinder-db-create-kj8vc" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.965293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7cbg\" (UniqueName: \"kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg\") pod \"barbican-db-create-94grw\" (UID: \"4620f70b-3da3-4486-a8d7-1c784246d780\") " pod="openstack/barbican-db-create-94grw" Oct 03 07:15:46 crc kubenswrapper[4810]: I1003 07:15:46.983913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7cbg\" (UniqueName: \"kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg\") pod \"barbican-db-create-94grw\" (UID: \"4620f70b-3da3-4486-a8d7-1c784246d780\") " pod="openstack/barbican-db-create-94grw" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.009169 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrwv7\" (UniqueName: \"kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7\") pod \"cinder-db-create-kj8vc\" (UID: \"81605a74-99a2-4c21-8bac-8c4b64c2164f\") " pod="openstack/cinder-db-create-kj8vc" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.018208 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-59s9b"] Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.019614 4810 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.023759 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.024041 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.024158 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.024260 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkbxn" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.028752 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kj8vc" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.039843 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2llk2"] Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.041045 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2llk2" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.061619 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-59s9b"] Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.068283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb6wd\" (UniqueName: \"kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd\") pod \"neutron-db-create-2llk2\" (UID: \"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27\") " pod="openstack/neutron-db-create-2llk2" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.068369 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.068398 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.068434 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px7qg\" (UniqueName: \"kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.078967 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2llk2"] Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.137684 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-94grw" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.170463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb6wd\" (UniqueName: \"kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd\") pod \"neutron-db-create-2llk2\" (UID: \"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27\") " pod="openstack/neutron-db-create-2llk2" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.170553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.170582 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.170618 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px7qg\" (UniqueName: \"kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.174110 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.183354 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.193759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px7qg\" (UniqueName: \"kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg\") pod \"keystone-db-sync-59s9b\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.198493 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb6wd\" (UniqueName: \"kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd\") pod \"neutron-db-create-2llk2\" (UID: \"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27\") " pod="openstack/neutron-db-create-2llk2" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.346503 4810 generic.go:334] "Generic (PLEG): container finished" podID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerID="11335dcef00d17dd7b3e49d15f49dab20101994b28fccfcb5df8c98a29d836ff" exitCode=0 Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.346573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" 
event={"ID":"87e1b144-75c6-4ccc-b31b-e6774e470913","Type":"ContainerDied","Data":"11335dcef00d17dd7b3e49d15f49dab20101994b28fccfcb5df8c98a29d836ff"} Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.381461 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-59s9b" Oct 03 07:15:47 crc kubenswrapper[4810]: I1003 07:15:47.389349 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2llk2" Oct 03 07:15:50 crc kubenswrapper[4810]: I1003 07:15:50.422048 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused" Oct 03 07:15:55 crc kubenswrapper[4810]: I1003 07:15:55.421371 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.118:5353: connect: connection refused" Oct 03 07:15:58 crc kubenswrapper[4810]: E1003 07:15:58.238022 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24" Oct 03 07:15:58 crc kubenswrapper[4810]: E1003 07:15:58.238668 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4mrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-xk8xp_openstack(adfe75ef-eb52-4d1d-8164-716ea23ead8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:15:58 crc kubenswrapper[4810]: E1003 07:15:58.239969 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-xk8xp" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" Oct 03 07:15:58 crc kubenswrapper[4810]: E1003 07:15:58.460524 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:2069e730d5ced0e278392077ad261a3c35bf5df1d88735441859f23e8e3ceb24\\\"\"" pod="openstack/glance-db-sync-xk8xp" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.587529 4810 util.go:48] "No ready sandbox for pod can be found. 
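
The pull of the openstack-glance-api image fails with "copying config: context canceled", which the kubelet surfaces first as ErrImagePull and then, on the immediate retry, as ImagePullBackOff. Retries are not issued back to back; the kubelet spaces them out with an exponential back-off. The sketch below only illustrates that retry shape; the 10-second initial delay and 5-minute cap are the commonly cited kubelet defaults and are assumptions here, not values read from this node.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Illustrative back-off schedule only; initial delay and cap are assumed defaults.
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("attempt %d: wait %v before retrying the image pull\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
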
Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.696585 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb\") pod \"87e1b144-75c6-4ccc-b31b-e6774e470913\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.696622 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config\") pod \"87e1b144-75c6-4ccc-b31b-e6774e470913\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.696793 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc\") pod \"87e1b144-75c6-4ccc-b31b-e6774e470913\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.696811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb\") pod \"87e1b144-75c6-4ccc-b31b-e6774e470913\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.696925 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6f94\" (UniqueName: \"kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94\") pod \"87e1b144-75c6-4ccc-b31b-e6774e470913\" (UID: \"87e1b144-75c6-4ccc-b31b-e6774e470913\") " Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.706054 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94" (OuterVolumeSpecName: "kube-api-access-r6f94") pod "87e1b144-75c6-4ccc-b31b-e6774e470913" (UID: "87e1b144-75c6-4ccc-b31b-e6774e470913"). InnerVolumeSpecName "kube-api-access-r6f94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.743948 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config" (OuterVolumeSpecName: "config") pod "87e1b144-75c6-4ccc-b31b-e6774e470913" (UID: "87e1b144-75c6-4ccc-b31b-e6774e470913"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.744395 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87e1b144-75c6-4ccc-b31b-e6774e470913" (UID: "87e1b144-75c6-4ccc-b31b-e6774e470913"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.747772 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87e1b144-75c6-4ccc-b31b-e6774e470913" (UID: "87e1b144-75c6-4ccc-b31b-e6774e470913"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.754829 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87e1b144-75c6-4ccc-b31b-e6774e470913" (UID: "87e1b144-75c6-4ccc-b31b-e6774e470913"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.798638 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.798678 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.798692 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6f94\" (UniqueName: \"kubernetes.io/projected/87e1b144-75c6-4ccc-b31b-e6774e470913-kube-api-access-r6f94\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.798706 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.798719 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87e1b144-75c6-4ccc-b31b-e6774e470913-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.808328 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-94grw"] Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.820389 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-59s9b"] Oct 03 07:15:58 crc kubenswrapper[4810]: I1003 07:15:58.950525 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2llk2"] Oct 03 07:15:58 crc kubenswrapper[4810]: W1003 07:15:58.955759 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dbb50cd_f5bf_48b4_95a9_0f0f42725c27.slice/crio-408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5 WatchSource:0}: Error finding container 408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5: Status 404 returned error can't find the container with id 408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5 Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.011306 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kj8vc"] Oct 03 07:15:59 crc kubenswrapper[4810]: W1003 07:15:59.072882 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81605a74_99a2_4c21_8bac_8c4b64c2164f.slice/crio-eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e WatchSource:0}: Error finding container eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e: Status 404 returned error can't find the container with id eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e Oct 03 07:15:59 
crc kubenswrapper[4810]: I1003 07:15:59.468462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" event={"ID":"87e1b144-75c6-4ccc-b31b-e6774e470913","Type":"ContainerDied","Data":"3b36d24fd67bfb7a108d4946e63dbeb2fad5e0370a03b3fc1ce55fe0a9900bf3"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.468496 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69d4d5cdc5-649t6" Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.468820 4810 scope.go:117] "RemoveContainer" containerID="11335dcef00d17dd7b3e49d15f49dab20101994b28fccfcb5df8c98a29d836ff" Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.474526 4810 generic.go:334] "Generic (PLEG): container finished" podID="0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" containerID="19bc7021dff026f5ac16902ab90bf92bbb2ae86b78f15f02e6fa93154a4ef524" exitCode=0 Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.474583 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2llk2" event={"ID":"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27","Type":"ContainerDied","Data":"19bc7021dff026f5ac16902ab90bf92bbb2ae86b78f15f02e6fa93154a4ef524"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.474611 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2llk2" event={"ID":"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27","Type":"ContainerStarted","Data":"408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.478192 4810 generic.go:334] "Generic (PLEG): container finished" podID="81605a74-99a2-4c21-8bac-8c4b64c2164f" containerID="604fbe674101eea5cb7e0d47c285c60a0094065521648f16611c121922cc9a2f" exitCode=0 Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.478283 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kj8vc" event={"ID":"81605a74-99a2-4c21-8bac-8c4b64c2164f","Type":"ContainerDied","Data":"604fbe674101eea5cb7e0d47c285c60a0094065521648f16611c121922cc9a2f"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.478326 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kj8vc" event={"ID":"81605a74-99a2-4c21-8bac-8c4b64c2164f","Type":"ContainerStarted","Data":"eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.481358 4810 generic.go:334] "Generic (PLEG): container finished" podID="4620f70b-3da3-4486-a8d7-1c784246d780" containerID="f56a3b36c420c607e23d9871343e804d3def38c7f9f58143a8bb5e35ed03e6e3" exitCode=0 Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.481499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-94grw" event={"ID":"4620f70b-3da3-4486-a8d7-1c784246d780","Type":"ContainerDied","Data":"f56a3b36c420c607e23d9871343e804d3def38c7f9f58143a8bb5e35ed03e6e3"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.481527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-94grw" event={"ID":"4620f70b-3da3-4486-a8d7-1c784246d780","Type":"ContainerStarted","Data":"b32ed9704cc7900de07e4830650d6ca5369bb63774ba4a854a902f1254aacab9"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.485797 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-59s9b" 
event={"ID":"e85501ad-bec4-4381-a692-07aa40c0fc57","Type":"ContainerStarted","Data":"0d82762fd1d9dcf773a48e10b11c4ad3f7c782df8e8b0dabfa2d871b3660e081"} Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.491418 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.493011 4810 scope.go:117] "RemoveContainer" containerID="ee351051df1f3457a08e6b5a900ece536b6e9bd569a7765104faa50c35d383fe" Oct 03 07:15:59 crc kubenswrapper[4810]: I1003 07:15:59.497255 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69d4d5cdc5-649t6"] Oct 03 07:16:01 crc kubenswrapper[4810]: I1003 07:16:01.319553 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" path="/var/lib/kubelet/pods/87e1b144-75c6-4ccc-b31b-e6774e470913/volumes" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.514382 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kj8vc" event={"ID":"81605a74-99a2-4c21-8bac-8c4b64c2164f","Type":"ContainerDied","Data":"eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e"} Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.514746 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eebd79f91505218421aa01e4320e90d57699d2089a9743e489df25c4d191087e" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.516569 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-94grw" event={"ID":"4620f70b-3da3-4486-a8d7-1c784246d780","Type":"ContainerDied","Data":"b32ed9704cc7900de07e4830650d6ca5369bb63774ba4a854a902f1254aacab9"} Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.516596 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b32ed9704cc7900de07e4830650d6ca5369bb63774ba4a854a902f1254aacab9" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.518930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2llk2" event={"ID":"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27","Type":"ContainerDied","Data":"408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5"} Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.518980 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="408a7dcb340147306765a4e9d51f4c59b206db180128f22345fa000c57d97de5" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.646239 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-94grw" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.652234 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kj8vc" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.678525 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7cbg\" (UniqueName: \"kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg\") pod \"4620f70b-3da3-4486-a8d7-1c784246d780\" (UID: \"4620f70b-3da3-4486-a8d7-1c784246d780\") " Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.678703 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrwv7\" (UniqueName: \"kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7\") pod \"81605a74-99a2-4c21-8bac-8c4b64c2164f\" (UID: \"81605a74-99a2-4c21-8bac-8c4b64c2164f\") " Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.680040 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2llk2" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.692449 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg" (OuterVolumeSpecName: "kube-api-access-h7cbg") pod "4620f70b-3da3-4486-a8d7-1c784246d780" (UID: "4620f70b-3da3-4486-a8d7-1c784246d780"). InnerVolumeSpecName "kube-api-access-h7cbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.693366 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7" (OuterVolumeSpecName: "kube-api-access-mrwv7") pod "81605a74-99a2-4c21-8bac-8c4b64c2164f" (UID: "81605a74-99a2-4c21-8bac-8c4b64c2164f"). InnerVolumeSpecName "kube-api-access-mrwv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.780329 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb6wd\" (UniqueName: \"kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd\") pod \"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27\" (UID: \"0dbb50cd-f5bf-48b4-95a9-0f0f42725c27\") " Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.780800 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7cbg\" (UniqueName: \"kubernetes.io/projected/4620f70b-3da3-4486-a8d7-1c784246d780-kube-api-access-h7cbg\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.780823 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrwv7\" (UniqueName: \"kubernetes.io/projected/81605a74-99a2-4c21-8bac-8c4b64c2164f-kube-api-access-mrwv7\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.783236 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd" (OuterVolumeSpecName: "kube-api-access-jb6wd") pod "0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" (UID: "0dbb50cd-f5bf-48b4-95a9-0f0f42725c27"). InnerVolumeSpecName "kube-api-access-jb6wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:02 crc kubenswrapper[4810]: I1003 07:16:02.882106 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb6wd\" (UniqueName: \"kubernetes.io/projected/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27-kube-api-access-jb6wd\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:03 crc kubenswrapper[4810]: I1003 07:16:03.534149 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kj8vc" Oct 03 07:16:03 crc kubenswrapper[4810]: I1003 07:16:03.534238 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-94grw" Oct 03 07:16:03 crc kubenswrapper[4810]: I1003 07:16:03.534171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-59s9b" event={"ID":"e85501ad-bec4-4381-a692-07aa40c0fc57","Type":"ContainerStarted","Data":"456574b4e24c97ad994ceb383265e419a3c7ecdc16cc8a3733edb174e95a36ad"} Oct 03 07:16:03 crc kubenswrapper[4810]: I1003 07:16:03.535768 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2llk2" Oct 03 07:16:03 crc kubenswrapper[4810]: I1003 07:16:03.557740 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-59s9b" podStartSLOduration=13.861661604 podStartE2EDuration="17.5577162s" podCreationTimestamp="2025-10-03 07:15:46 +0000 UTC" firstStartedPulling="2025-10-03 07:15:58.808168643 +0000 UTC m=+1192.235419398" lastFinishedPulling="2025-10-03 07:16:02.504223259 +0000 UTC m=+1195.931473994" observedRunningTime="2025-10-03 07:16:03.555156124 +0000 UTC m=+1196.982406869" watchObservedRunningTime="2025-10-03 07:16:03.5577162 +0000 UTC m=+1196.984966945" Oct 03 07:16:06 crc kubenswrapper[4810]: I1003 07:16:06.565542 4810 generic.go:334] "Generic (PLEG): container finished" podID="e85501ad-bec4-4381-a692-07aa40c0fc57" containerID="456574b4e24c97ad994ceb383265e419a3c7ecdc16cc8a3733edb174e95a36ad" exitCode=0 Oct 03 07:16:06 crc kubenswrapper[4810]: I1003 07:16:06.565653 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-59s9b" event={"ID":"e85501ad-bec4-4381-a692-07aa40c0fc57","Type":"ContainerDied","Data":"456574b4e24c97ad994ceb383265e419a3c7ecdc16cc8a3733edb174e95a36ad"} Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.898355 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-59s9b" Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.974426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle\") pod \"e85501ad-bec4-4381-a692-07aa40c0fc57\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.974484 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data\") pod \"e85501ad-bec4-4381-a692-07aa40c0fc57\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.974565 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px7qg\" (UniqueName: \"kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg\") pod \"e85501ad-bec4-4381-a692-07aa40c0fc57\" (UID: \"e85501ad-bec4-4381-a692-07aa40c0fc57\") " Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.983306 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg" (OuterVolumeSpecName: "kube-api-access-px7qg") pod "e85501ad-bec4-4381-a692-07aa40c0fc57" (UID: "e85501ad-bec4-4381-a692-07aa40c0fc57"). InnerVolumeSpecName "kube-api-access-px7qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:07 crc kubenswrapper[4810]: I1003 07:16:07.997956 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85501ad-bec4-4381-a692-07aa40c0fc57" (UID: "e85501ad-bec4-4381-a692-07aa40c0fc57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.014272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data" (OuterVolumeSpecName: "config-data") pod "e85501ad-bec4-4381-a692-07aa40c0fc57" (UID: "e85501ad-bec4-4381-a692-07aa40c0fc57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.076584 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.076718 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px7qg\" (UniqueName: \"kubernetes.io/projected/e85501ad-bec4-4381-a692-07aa40c0fc57-kube-api-access-px7qg\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.076783 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85501ad-bec4-4381-a692-07aa40c0fc57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.584869 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-59s9b" event={"ID":"e85501ad-bec4-4381-a692-07aa40c0fc57","Type":"ContainerDied","Data":"0d82762fd1d9dcf773a48e10b11c4ad3f7c782df8e8b0dabfa2d871b3660e081"} Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.585541 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d82762fd1d9dcf773a48e10b11c4ad3f7c782df8e8b0dabfa2d871b3660e081" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.585003 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-59s9b" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859400 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859799 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85501ad-bec4-4381-a692-07aa40c0fc57" containerName="keystone-db-sync" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859821 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85501ad-bec4-4381-a692-07aa40c0fc57" containerName="keystone-db-sync" Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859837 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859846 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859858 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81605a74-99a2-4c21-8bac-8c4b64c2164f" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859868 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="81605a74-99a2-4c21-8bac-8c4b64c2164f" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859916 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4620f70b-3da3-4486-a8d7-1c784246d780" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859924 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4620f70b-3da3-4486-a8d7-1c784246d780" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859940 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" 
containerName="init" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859947 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="init" Oct 03 07:16:08 crc kubenswrapper[4810]: E1003 07:16:08.859960 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.859967 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.860163 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4620f70b-3da3-4486-a8d7-1c784246d780" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.860181 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e1b144-75c6-4ccc-b31b-e6774e470913" containerName="dnsmasq-dns" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.860193 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="81605a74-99a2-4c21-8bac-8c4b64c2164f" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.860214 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" containerName="mariadb-database-create" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.860226 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85501ad-bec4-4381-a692-07aa40c0fc57" containerName="keystone-db-sync" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.861126 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.868424 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889474 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889494 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889519 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kcs\" (UniqueName: \"kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " 
pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889552 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.889583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.890625 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-d9zjq"] Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.891759 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.896064 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.896267 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkbxn" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.896387 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.896064 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.905160 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d9zjq"] Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990553 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990586 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990625 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: 
\"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990650 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.990706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991115 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jp4l\" (UniqueName: \"kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991322 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991532 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991650 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.992529 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.992677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kcs\" (UniqueName: \"kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991536 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: 
\"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.992173 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.992485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:08 crc kubenswrapper[4810]: I1003 07:16:08.991709 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.014266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kcs\" (UniqueName: \"kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs\") pod \"dnsmasq-dns-7ff484bf57-5qzbg\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.071397 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.073251 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.076279 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.076590 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.084977 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094302 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094400 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jp4l\" (UniqueName: \"kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094419 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094440 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtrnr\" (UniqueName: \"kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094459 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094479 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc 
kubenswrapper[4810]: I1003 07:16:09.094519 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094578 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094593 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.094610 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.099079 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.099346 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.099821 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.103595 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.107225 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.121292 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jp4l\" (UniqueName: \"kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l\") pod \"keystone-bootstrap-d9zjq\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.178871 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.196333 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197162 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197331 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtrnr\" (UniqueName: \"kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197517 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197688 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.197871 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.200329 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " 
pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.200535 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.202570 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.204812 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.207788 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.211402 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.218377 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.224602 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtrnr\" (UniqueName: \"kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr\") pod \"ceilometer-0\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.291047 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.354406 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-67bvc"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.355494 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-67bvc"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.355514 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.357025 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.358555 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.360166 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.364331 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47jf6" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.364546 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.365490 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.395010 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411740 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411804 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411867 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411886 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411925 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs4wp\" (UniqueName: \"kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411964 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.411987 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7b8z\" (UniqueName: 
\"kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.412003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.412036 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.412056 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.412096 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.513998 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514439 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514495 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514578 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc\") pod 
\"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514599 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs4wp\" (UniqueName: \"kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514640 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514672 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7b8z\" (UniqueName: \"kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514729 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514753 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.514984 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.516087 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.516959 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" 
Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.517214 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.517612 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.517714 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.520010 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.520355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.523003 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.535524 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7b8z\" (UniqueName: \"kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z\") pod \"placement-db-sync-67bvc\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.536048 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs4wp\" (UniqueName: \"kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp\") pod \"dnsmasq-dns-6c49cc7cc-fvzg7\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.727802 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.754110 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.765399 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.785147 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-d9zjq"] Oct 03 07:16:09 crc kubenswrapper[4810]: I1003 07:16:09.804914 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.223375 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-67bvc"] Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.315072 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:16:10 crc kubenswrapper[4810]: W1003 07:16:10.326048 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5864a44c_e8be_40af_a27d_9659ce6d532a.slice/crio-5bd71e0c5be807b576f87600610fc57dd3f8ca9661bd3a066e742e909358d32a WatchSource:0}: Error finding container 5bd71e0c5be807b576f87600610fc57dd3f8ca9661bd3a066e742e909358d32a: Status 404 returned error can't find the container with id 5bd71e0c5be807b576f87600610fc57dd3f8ca9661bd3a066e742e909358d32a Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.607842 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-67bvc" event={"ID":"c9581fa8-c37c-4695-8e8f-34e0213752dd","Type":"ContainerStarted","Data":"d23d93cde1be968568e1cde4c3a45edd85cc0a9e0a89bfbb235ab864ecc2425a"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.609097 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerStarted","Data":"87c2804c077e827e04b8217c0bf2a89781c723fbd52837718391f5d9ecceec3f"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.611154 4810 generic.go:334] "Generic (PLEG): container finished" podID="ce06d216-c4cf-4a11-8ec4-754dab1b02fc" containerID="15dc3f6bb3c9ebc3cf26b5954ab36160e3ba736a16f0fa6cf672ca95a7b1246f" exitCode=0 Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.611208 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" event={"ID":"ce06d216-c4cf-4a11-8ec4-754dab1b02fc","Type":"ContainerDied","Data":"15dc3f6bb3c9ebc3cf26b5954ab36160e3ba736a16f0fa6cf672ca95a7b1246f"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.611235 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" event={"ID":"ce06d216-c4cf-4a11-8ec4-754dab1b02fc","Type":"ContainerStarted","Data":"733037b233f79b17373019644bea0b5ee765063e4e837b580af3ef5abfc0d04e"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.613504 4810 generic.go:334] "Generic (PLEG): container finished" podID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerID="2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09" exitCode=0 Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.613560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" event={"ID":"5864a44c-e8be-40af-a27d-9659ce6d532a","Type":"ContainerDied","Data":"2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.613636 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" 
event={"ID":"5864a44c-e8be-40af-a27d-9659ce6d532a","Type":"ContainerStarted","Data":"5bd71e0c5be807b576f87600610fc57dd3f8ca9661bd3a066e742e909358d32a"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.615030 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9zjq" event={"ID":"e82ae733-3031-4bfd-8de7-1dd6621beb6b","Type":"ContainerStarted","Data":"2d48901a7cfa1029c525d00ad89dff9e258bc488c0aa6f201c065b594d0b5cd4"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.615070 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9zjq" event={"ID":"e82ae733-3031-4bfd-8de7-1dd6621beb6b","Type":"ContainerStarted","Data":"60fa74936e5d8340639f95b733f3d2b0c8c6ba2cffa1ff386c625f35ad9759a6"} Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.681789 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-d9zjq" podStartSLOduration=2.6817687980000002 podStartE2EDuration="2.681768798s" podCreationTimestamp="2025-10-03 07:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:10.679718194 +0000 UTC m=+1204.106968929" watchObservedRunningTime="2025-10-03 07:16:10.681768798 +0000 UTC m=+1204.109019533" Oct 03 07:16:10 crc kubenswrapper[4810]: I1003 07:16:10.869125 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044572 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044650 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044693 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044841 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5kcs\" (UniqueName: \"kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044884 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.044948 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config\") pod \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\" (UID: \"ce06d216-c4cf-4a11-8ec4-754dab1b02fc\") " Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.052332 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs" (OuterVolumeSpecName: "kube-api-access-n5kcs") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "kube-api-access-n5kcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.090110 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.092400 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.092751 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config" (OuterVolumeSpecName: "config") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.103816 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.106003 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce06d216-c4cf-4a11-8ec4-754dab1b02fc" (UID: "ce06d216-c4cf-4a11-8ec4-754dab1b02fc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150224 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150258 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150268 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150277 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5kcs\" (UniqueName: \"kubernetes.io/projected/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-kube-api-access-n5kcs\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150288 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.150297 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce06d216-c4cf-4a11-8ec4-754dab1b02fc-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.171069 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.651643 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" event={"ID":"5864a44c-e8be-40af-a27d-9659ce6d532a","Type":"ContainerStarted","Data":"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926"} Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.652014 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.662396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" event={"ID":"ce06d216-c4cf-4a11-8ec4-754dab1b02fc","Type":"ContainerDied","Data":"733037b233f79b17373019644bea0b5ee765063e4e837b580af3ef5abfc0d04e"} Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.662487 4810 scope.go:117] "RemoveContainer" containerID="15dc3f6bb3c9ebc3cf26b5954ab36160e3ba736a16f0fa6cf672ca95a7b1246f" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.662441 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff484bf57-5qzbg" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.677392 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" podStartSLOduration=2.6773758819999998 podStartE2EDuration="2.677375882s" podCreationTimestamp="2025-10-03 07:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:11.675981525 +0000 UTC m=+1205.103232260" watchObservedRunningTime="2025-10-03 07:16:11.677375882 +0000 UTC m=+1205.104626617" Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.713082 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:11 crc kubenswrapper[4810]: I1003 07:16:11.739217 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff484bf57-5qzbg"] Oct 03 07:16:13 crc kubenswrapper[4810]: I1003 07:16:13.313651 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce06d216-c4cf-4a11-8ec4-754dab1b02fc" path="/var/lib/kubelet/pods/ce06d216-c4cf-4a11-8ec4-754dab1b02fc/volumes" Oct 03 07:16:14 crc kubenswrapper[4810]: I1003 07:16:14.695768 4810 generic.go:334] "Generic (PLEG): container finished" podID="e82ae733-3031-4bfd-8de7-1dd6621beb6b" containerID="2d48901a7cfa1029c525d00ad89dff9e258bc488c0aa6f201c065b594d0b5cd4" exitCode=0 Oct 03 07:16:14 crc kubenswrapper[4810]: I1003 07:16:14.695861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9zjq" event={"ID":"e82ae733-3031-4bfd-8de7-1dd6621beb6b","Type":"ContainerDied","Data":"2d48901a7cfa1029c525d00ad89dff9e258bc488c0aa6f201c065b594d0b5cd4"} Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.554783 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561031 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561221 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561356 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jp4l\" (UniqueName: \"kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561513 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561745 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.561844 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts\") pod \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\" (UID: \"e82ae733-3031-4bfd-8de7-1dd6621beb6b\") " Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.564497 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.567148 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l" (OuterVolumeSpecName: "kube-api-access-4jp4l") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "kube-api-access-4jp4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.567819 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts" (OuterVolumeSpecName: "scripts") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.570387 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.646514 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.649485 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data" (OuterVolumeSpecName: "config-data") pod "e82ae733-3031-4bfd-8de7-1dd6621beb6b" (UID: "e82ae733-3031-4bfd-8de7-1dd6621beb6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665478 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665521 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665535 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665547 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665560 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jp4l\" (UniqueName: \"kubernetes.io/projected/e82ae733-3031-4bfd-8de7-1dd6621beb6b-kube-api-access-4jp4l\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.665572 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e82ae733-3031-4bfd-8de7-1dd6621beb6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.742742 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerStarted","Data":"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb"} Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.764531 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-d9zjq" 
event={"ID":"e82ae733-3031-4bfd-8de7-1dd6621beb6b","Type":"ContainerDied","Data":"60fa74936e5d8340639f95b733f3d2b0c8c6ba2cffa1ff386c625f35ad9759a6"} Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.764579 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60fa74936e5d8340639f95b733f3d2b0c8c6ba2cffa1ff386c625f35ad9759a6" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.764634 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-d9zjq" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.780343 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-67bvc" event={"ID":"c9581fa8-c37c-4695-8e8f-34e0213752dd","Type":"ContainerStarted","Data":"17d9144dd433472f1a4fb68c490e1d20c3e7b4f654b68c31571cb1ea1e6b851c"} Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.794673 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0121-account-create-5kgnh"] Oct 03 07:16:16 crc kubenswrapper[4810]: E1003 07:16:16.797334 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82ae733-3031-4bfd-8de7-1dd6621beb6b" containerName="keystone-bootstrap" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.797375 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82ae733-3031-4bfd-8de7-1dd6621beb6b" containerName="keystone-bootstrap" Oct 03 07:16:16 crc kubenswrapper[4810]: E1003 07:16:16.797384 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce06d216-c4cf-4a11-8ec4-754dab1b02fc" containerName="init" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.797391 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce06d216-c4cf-4a11-8ec4-754dab1b02fc" containerName="init" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.797604 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce06d216-c4cf-4a11-8ec4-754dab1b02fc" containerName="init" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.797626 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82ae733-3031-4bfd-8de7-1dd6621beb6b" containerName="keystone-bootstrap" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.798280 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.806359 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.814383 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0121-account-create-5kgnh"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.823268 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-67bvc" podStartSLOduration=1.616299212 podStartE2EDuration="7.823239111s" podCreationTimestamp="2025-10-03 07:16:09 +0000 UTC" firstStartedPulling="2025-10-03 07:16:10.229433055 +0000 UTC m=+1203.656683790" lastFinishedPulling="2025-10-03 07:16:16.436372954 +0000 UTC m=+1209.863623689" observedRunningTime="2025-10-03 07:16:16.806587287 +0000 UTC m=+1210.233838022" watchObservedRunningTime="2025-10-03 07:16:16.823239111 +0000 UTC m=+1210.250489846" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.841320 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-d9zjq"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.850396 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-d9zjq"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.867470 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nvd\" (UniqueName: \"kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd\") pod \"barbican-0121-account-create-5kgnh\" (UID: \"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675\") " pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.886976 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8097-account-create-zgk9n"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.888240 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.891672 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.897542 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8097-account-create-zgk9n"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.935664 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fhzl5"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.936920 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.941279 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.942022 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.942304 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkbxn" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.949162 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968206 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968263 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968318 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968377 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwgq4\" (UniqueName: \"kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968412 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968435 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pm4\" (UniqueName: \"kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4\") pod \"cinder-8097-account-create-zgk9n\" (UID: \"3fcb22be-92b4-425f-ba84-5c2fac3cd183\") " pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.968471 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:16 
crc kubenswrapper[4810]: I1003 07:16:16.968503 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nvd\" (UniqueName: \"kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd\") pod \"barbican-0121-account-create-5kgnh\" (UID: \"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675\") " pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.974814 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fhzl5"] Oct 03 07:16:16 crc kubenswrapper[4810]: I1003 07:16:16.992332 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nvd\" (UniqueName: \"kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd\") pod \"barbican-0121-account-create-5kgnh\" (UID: \"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675\") " pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070392 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070447 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070513 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwgq4\" (UniqueName: \"kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070543 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070566 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pm4\" (UniqueName: \"kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4\") pod \"cinder-8097-account-create-zgk9n\" (UID: \"3fcb22be-92b4-425f-ba84-5c2fac3cd183\") " pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.070598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys\") pod \"keystone-bootstrap-fhzl5\" 
(UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.076631 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.080818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.081584 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.083462 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.084244 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.092050 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwgq4\" (UniqueName: \"kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4\") pod \"keystone-bootstrap-fhzl5\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.096171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pm4\" (UniqueName: \"kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4\") pod \"cinder-8097-account-create-zgk9n\" (UID: \"3fcb22be-92b4-425f-ba84-5c2fac3cd183\") " pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.130223 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.218623 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-073c-account-create-nkrq2"] Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.220027 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-073c-account-create-nkrq2"] Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.220139 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.237559 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.253605 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.265603 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.273583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snj4c\" (UniqueName: \"kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c\") pod \"neutron-073c-account-create-nkrq2\" (UID: \"7d7e8bca-49ef-4f7b-b962-a215f9a42605\") " pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.329783 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82ae733-3031-4bfd-8de7-1dd6621beb6b" path="/var/lib/kubelet/pods/e82ae733-3031-4bfd-8de7-1dd6621beb6b/volumes" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.384697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snj4c\" (UniqueName: \"kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c\") pod \"neutron-073c-account-create-nkrq2\" (UID: \"7d7e8bca-49ef-4f7b-b962-a215f9a42605\") " pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.454847 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snj4c\" (UniqueName: \"kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c\") pod \"neutron-073c-account-create-nkrq2\" (UID: \"7d7e8bca-49ef-4f7b-b962-a215f9a42605\") " pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.527423 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0121-account-create-5kgnh"] Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.568170 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.813228 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0121-account-create-5kgnh" event={"ID":"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675","Type":"ContainerStarted","Data":"fb79f66c4670c3586706b7e5c013e8c7cc097bbb443a2256998ca9acefb9e5c9"} Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.817578 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xk8xp" event={"ID":"adfe75ef-eb52-4d1d-8164-716ea23ead8d","Type":"ContainerStarted","Data":"6d769ed9f0cf1f6249f4f26aa6c0432b45547792f38f58ec8b3c51b5293b812d"} Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.849750 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xk8xp" podStartSLOduration=3.238552089 podStartE2EDuration="39.849730678s" podCreationTimestamp="2025-10-03 07:15:38 +0000 UTC" firstStartedPulling="2025-10-03 07:15:39.826825147 +0000 UTC m=+1173.254075882" lastFinishedPulling="2025-10-03 07:16:16.438003726 +0000 UTC m=+1209.865254471" observedRunningTime="2025-10-03 07:16:17.838916796 +0000 UTC m=+1211.266167541" watchObservedRunningTime="2025-10-03 07:16:17.849730678 +0000 UTC m=+1211.276981413" Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.885787 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8097-account-create-zgk9n"] Oct 03 07:16:17 crc kubenswrapper[4810]: I1003 07:16:17.960524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fhzl5"] Oct 03 07:16:17 crc kubenswrapper[4810]: W1003 07:16:17.981182 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46bba23b_757b_4303_a92c_78cb8a956c35.slice/crio-82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c WatchSource:0}: Error finding container 82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c: Status 404 returned error can't find the container with id 82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.133824 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-073c-account-create-nkrq2"] Oct 03 07:16:18 crc kubenswrapper[4810]: W1003 07:16:18.354544 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7e8bca_49ef_4f7b_b962_a215f9a42605.slice/crio-d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2 WatchSource:0}: Error finding container d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2: Status 404 returned error can't find the container with id d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2 Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.828608 4810 generic.go:334] "Generic (PLEG): container finished" podID="f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" containerID="4edd2c79f8fe2026cb6b52aabaff3103779a04f7415a9f5bd4772503c6dbdd01" exitCode=0 Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.829012 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0121-account-create-5kgnh" event={"ID":"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675","Type":"ContainerDied","Data":"4edd2c79f8fe2026cb6b52aabaff3103779a04f7415a9f5bd4772503c6dbdd01"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.830642 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fhzl5" event={"ID":"46bba23b-757b-4303-a92c-78cb8a956c35","Type":"ContainerStarted","Data":"47d80e02d12804f4a96f6f83f5614932491036bb359019c9891d07577398c937"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.830697 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fhzl5" event={"ID":"46bba23b-757b-4303-a92c-78cb8a956c35","Type":"ContainerStarted","Data":"82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.831578 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-073c-account-create-nkrq2" event={"ID":"7d7e8bca-49ef-4f7b-b962-a215f9a42605","Type":"ContainerStarted","Data":"d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.837680 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8097-account-create-zgk9n" event={"ID":"3fcb22be-92b4-425f-ba84-5c2fac3cd183","Type":"ContainerDied","Data":"22a40c14e070770d0a321f3a95675ac6ccfbac50ee1ca196e1e371d5f2b757f2"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.837618 4810 generic.go:334] "Generic (PLEG): container finished" podID="3fcb22be-92b4-425f-ba84-5c2fac3cd183" containerID="22a40c14e070770d0a321f3a95675ac6ccfbac50ee1ca196e1e371d5f2b757f2" exitCode=0 Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.837856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8097-account-create-zgk9n" event={"ID":"3fcb22be-92b4-425f-ba84-5c2fac3cd183","Type":"ContainerStarted","Data":"681bbb5741fcaa58192f4da27823bfd16e097037f825654a12edde22723f43f4"} Oct 03 07:16:18 crc kubenswrapper[4810]: I1003 07:16:18.908566 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fhzl5" podStartSLOduration=2.908553208 podStartE2EDuration="2.908553208s" podCreationTimestamp="2025-10-03 07:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:18.905202752 +0000 UTC m=+1212.332453507" watchObservedRunningTime="2025-10-03 07:16:18.908553208 +0000 UTC m=+1212.335803943" Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.767270 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.821991 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.822563 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerName="dnsmasq-dns" containerID="cri-o://3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e" gracePeriod=10 Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.856590 4810 generic.go:334] "Generic (PLEG): container finished" podID="7d7e8bca-49ef-4f7b-b962-a215f9a42605" containerID="faf3614757fd2717eb1a45d712084253daff94f38893df384ccb4ba891e7c533" exitCode=0 Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.856675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-073c-account-create-nkrq2" 
event={"ID":"7d7e8bca-49ef-4f7b-b962-a215f9a42605","Type":"ContainerDied","Data":"faf3614757fd2717eb1a45d712084253daff94f38893df384ccb4ba891e7c533"} Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.860334 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9581fa8-c37c-4695-8e8f-34e0213752dd" containerID="17d9144dd433472f1a4fb68c490e1d20c3e7b4f654b68c31571cb1ea1e6b851c" exitCode=0 Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.860515 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-67bvc" event={"ID":"c9581fa8-c37c-4695-8e8f-34e0213752dd","Type":"ContainerDied","Data":"17d9144dd433472f1a4fb68c490e1d20c3e7b4f654b68c31571cb1ea1e6b851c"} Oct 03 07:16:19 crc kubenswrapper[4810]: I1003 07:16:19.865269 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerStarted","Data":"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0"} Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.394714 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.401353 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.473453 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.555063 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4nvd\" (UniqueName: \"kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd\") pod \"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675\" (UID: \"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.556012 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25pm4\" (UniqueName: \"kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4\") pod \"3fcb22be-92b4-425f-ba84-5c2fac3cd183\" (UID: \"3fcb22be-92b4-425f-ba84-5c2fac3cd183\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.561424 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd" (OuterVolumeSpecName: "kube-api-access-z4nvd") pod "f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" (UID: "f53ec84c-a2d7-4f16-9222-1e3ed0cfe675"). InnerVolumeSpecName "kube-api-access-z4nvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.561464 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4" (OuterVolumeSpecName: "kube-api-access-25pm4") pod "3fcb22be-92b4-425f-ba84-5c2fac3cd183" (UID: "3fcb22be-92b4-425f-ba84-5c2fac3cd183"). InnerVolumeSpecName "kube-api-access-25pm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657669 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657786 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657830 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl7kl\" (UniqueName: \"kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657965 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.657988 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc\") pod \"89d07cf8-f28d-4137-8eb6-cdf20040b346\" (UID: \"89d07cf8-f28d-4137-8eb6-cdf20040b346\") " Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.658330 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4nvd\" (UniqueName: \"kubernetes.io/projected/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675-kube-api-access-z4nvd\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.658345 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25pm4\" (UniqueName: \"kubernetes.io/projected/3fcb22be-92b4-425f-ba84-5c2fac3cd183-kube-api-access-25pm4\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.660882 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl" (OuterVolumeSpecName: "kube-api-access-sl7kl") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). InnerVolumeSpecName "kube-api-access-sl7kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.698634 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config" (OuterVolumeSpecName: "config") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.701160 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.705360 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.728202 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.740506 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89d07cf8-f28d-4137-8eb6-cdf20040b346" (UID: "89d07cf8-f28d-4137-8eb6-cdf20040b346"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761193 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761224 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl7kl\" (UniqueName: \"kubernetes.io/projected/89d07cf8-f28d-4137-8eb6-cdf20040b346-kube-api-access-sl7kl\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761236 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761243 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761252 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.761262 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89d07cf8-f28d-4137-8eb6-cdf20040b346-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.874134 4810 generic.go:334] "Generic (PLEG): container finished" podID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerID="3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e" exitCode=0 Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.874203 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.874215 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" event={"ID":"89d07cf8-f28d-4137-8eb6-cdf20040b346","Type":"ContainerDied","Data":"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e"} Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.874241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987bc7949-2cbfl" event={"ID":"89d07cf8-f28d-4137-8eb6-cdf20040b346","Type":"ContainerDied","Data":"06dcd3221d4c0ea357b6aeeb5175ccd79df323e9b6946da4558a5e869c7ca52b"} Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.874256 4810 scope.go:117] "RemoveContainer" containerID="3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.877825 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0121-account-create-5kgnh" event={"ID":"f53ec84c-a2d7-4f16-9222-1e3ed0cfe675","Type":"ContainerDied","Data":"fb79f66c4670c3586706b7e5c013e8c7cc097bbb443a2256998ca9acefb9e5c9"} Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.877866 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb79f66c4670c3586706b7e5c013e8c7cc097bbb443a2256998ca9acefb9e5c9" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.877868 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0121-account-create-5kgnh" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.880442 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8097-account-create-zgk9n" event={"ID":"3fcb22be-92b4-425f-ba84-5c2fac3cd183","Type":"ContainerDied","Data":"681bbb5741fcaa58192f4da27823bfd16e097037f825654a12edde22723f43f4"} Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.880482 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="681bbb5741fcaa58192f4da27823bfd16e097037f825654a12edde22723f43f4" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.880495 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8097-account-create-zgk9n" Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.917076 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:16:20 crc kubenswrapper[4810]: I1003 07:16:20.932003 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987bc7949-2cbfl"] Oct 03 07:16:21 crc kubenswrapper[4810]: I1003 07:16:21.316443 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" path="/var/lib/kubelet/pods/89d07cf8-f28d-4137-8eb6-cdf20040b346/volumes" Oct 03 07:16:21 crc kubenswrapper[4810]: I1003 07:16:21.899610 4810 generic.go:334] "Generic (PLEG): container finished" podID="46bba23b-757b-4303-a92c-78cb8a956c35" containerID="47d80e02d12804f4a96f6f83f5614932491036bb359019c9891d07577398c937" exitCode=0 Oct 03 07:16:21 crc kubenswrapper[4810]: I1003 07:16:21.899657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fhzl5" event={"ID":"46bba23b-757b-4303-a92c-78cb8a956c35","Type":"ContainerDied","Data":"47d80e02d12804f4a96f6f83f5614932491036bb359019c9891d07577398c937"} Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.120367 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dth2s"] Oct 03 07:16:22 crc kubenswrapper[4810]: E1003 07:16:22.120925 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerName="dnsmasq-dns" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121004 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerName="dnsmasq-dns" Oct 03 07:16:22 crc kubenswrapper[4810]: E1003 07:16:22.121067 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121121 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: E1003 07:16:22.121183 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerName="init" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121231 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" containerName="init" Oct 03 07:16:22 crc kubenswrapper[4810]: E1003 07:16:22.121292 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fcb22be-92b4-425f-ba84-5c2fac3cd183" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121344 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fcb22be-92b4-425f-ba84-5c2fac3cd183" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121539 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121598 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fcb22be-92b4-425f-ba84-5c2fac3cd183" containerName="mariadb-account-create" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.121659 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d07cf8-f28d-4137-8eb6-cdf20040b346" 
containerName="dnsmasq-dns" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.122248 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.128543 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.129005 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.129034 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zm2x4" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.146808 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dth2s"] Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.222708 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-bpqps"] Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.223721 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.225840 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.226346 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qpvg4" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.234277 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bpqps"] Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.288918 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.288969 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.289117 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84wnv\" (UniqueName: \"kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.289135 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.289155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data\") pod 
\"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.289562 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.359637 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391048 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391095 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84wnv\" (UniqueName: \"kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391120 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391145 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391175 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391199 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391264 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391285 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.391330 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwl8\" (UniqueName: \"kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.392534 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.398290 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.400156 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.405624 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.410209 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84wnv\" (UniqueName: \"kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.416379 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data\") pod \"cinder-db-sync-dth2s\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.450966 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dth2s" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492141 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7b8z\" (UniqueName: \"kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z\") pod \"c9581fa8-c37c-4695-8e8f-34e0213752dd\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492241 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle\") pod \"c9581fa8-c37c-4695-8e8f-34e0213752dd\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492268 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts\") pod \"c9581fa8-c37c-4695-8e8f-34e0213752dd\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs\") pod \"c9581fa8-c37c-4695-8e8f-34e0213752dd\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492452 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data\") pod \"c9581fa8-c37c-4695-8e8f-34e0213752dd\" (UID: \"c9581fa8-c37c-4695-8e8f-34e0213752dd\") " Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwl8\" (UniqueName: \"kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492795 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.492840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.495847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs" (OuterVolumeSpecName: "logs") pod "c9581fa8-c37c-4695-8e8f-34e0213752dd" (UID: "c9581fa8-c37c-4695-8e8f-34e0213752dd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.498061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts" (OuterVolumeSpecName: "scripts") pod "c9581fa8-c37c-4695-8e8f-34e0213752dd" (UID: "c9581fa8-c37c-4695-8e8f-34e0213752dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.499570 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z" (OuterVolumeSpecName: "kube-api-access-p7b8z") pod "c9581fa8-c37c-4695-8e8f-34e0213752dd" (UID: "c9581fa8-c37c-4695-8e8f-34e0213752dd"). InnerVolumeSpecName "kube-api-access-p7b8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.500738 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.502558 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.511686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwl8\" (UniqueName: \"kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8\") pod \"barbican-db-sync-bpqps\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.529073 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9581fa8-c37c-4695-8e8f-34e0213752dd" (UID: "c9581fa8-c37c-4695-8e8f-34e0213752dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.532521 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data" (OuterVolumeSpecName: "config-data") pod "c9581fa8-c37c-4695-8e8f-34e0213752dd" (UID: "c9581fa8-c37c-4695-8e8f-34e0213752dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.555108 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.595052 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.595081 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7b8z\" (UniqueName: \"kubernetes.io/projected/c9581fa8-c37c-4695-8e8f-34e0213752dd-kube-api-access-p7b8z\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.595090 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.595099 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9581fa8-c37c-4695-8e8f-34e0213752dd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.595108 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9581fa8-c37c-4695-8e8f-34e0213752dd-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.923290 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-67bvc" Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.923403 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-67bvc" event={"ID":"c9581fa8-c37c-4695-8e8f-34e0213752dd","Type":"ContainerDied","Data":"d23d93cde1be968568e1cde4c3a45edd85cc0a9e0a89bfbb235ab864ecc2425a"} Oct 03 07:16:22 crc kubenswrapper[4810]: I1003 07:16:22.923444 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23d93cde1be968568e1cde4c3a45edd85cc0a9e0a89bfbb235ab864ecc2425a" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.551862 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:16:23 crc kubenswrapper[4810]: E1003 07:16:23.552603 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9581fa8-c37c-4695-8e8f-34e0213752dd" containerName="placement-db-sync" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.552628 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9581fa8-c37c-4695-8e8f-34e0213752dd" containerName="placement-db-sync" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.552958 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9581fa8-c37c-4695-8e8f-34e0213752dd" containerName="placement-db-sync" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.554101 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.566645 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.570844 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.571251 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.573808 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.573956 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-47jf6" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.574095 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718224 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718442 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718582 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718745 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d656\" (UniqueName: \"kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.718877 
4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.820829 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.820991 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.821067 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.821119 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.821184 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.821226 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.821302 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d656\" (UniqueName: \"kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.822412 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.826370 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.826721 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.828216 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.830163 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.830325 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.842871 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d656\" (UniqueName: \"kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656\") pod \"placement-75d48cc7d6-2knhz\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:23 crc kubenswrapper[4810]: I1003 07:16:23.891966 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.055291 4810 scope.go:117] "RemoveContainer" containerID="c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.156113 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.165540 4810 scope.go:117] "RemoveContainer" containerID="3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e" Oct 03 07:16:24 crc kubenswrapper[4810]: E1003 07:16:24.167669 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e\": container with ID starting with 3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e not found: ID does not exist" containerID="3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.167715 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e"} err="failed to get container status \"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e\": rpc error: code = NotFound desc = could not find container \"3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e\": container with ID starting with 3f1c1951e9dfd2f96cb479e6a79d1a237ef01782690b24a854ca94439f04419e not found: ID does not exist" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.167746 4810 scope.go:117] "RemoveContainer" containerID="c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776" Oct 03 07:16:24 crc kubenswrapper[4810]: E1003 07:16:24.168672 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776\": container with ID starting with c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776 not found: ID does not exist" containerID="c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.168727 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776"} err="failed to get container status \"c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776\": rpc error: code = NotFound desc = could not find container \"c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776\": container with ID starting with c8dee34ac3a5c8a43bb2a928a26f7d76d2916813de49ef492f512f391366f776 not found: ID does not exist" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.171298 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332160 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwgq4\" (UniqueName: \"kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332226 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332287 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332378 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332407 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snj4c\" (UniqueName: \"kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c\") pod \"7d7e8bca-49ef-4f7b-b962-a215f9a42605\" (UID: \"7d7e8bca-49ef-4f7b-b962-a215f9a42605\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332463 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.332519 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys\") pod \"46bba23b-757b-4303-a92c-78cb8a956c35\" (UID: \"46bba23b-757b-4303-a92c-78cb8a956c35\") " Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.347201 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.355226 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts" (OuterVolumeSpecName: "scripts") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.356556 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c" (OuterVolumeSpecName: "kube-api-access-snj4c") pod "7d7e8bca-49ef-4f7b-b962-a215f9a42605" (UID: "7d7e8bca-49ef-4f7b-b962-a215f9a42605"). InnerVolumeSpecName "kube-api-access-snj4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.357132 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4" (OuterVolumeSpecName: "kube-api-access-gwgq4") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "kube-api-access-gwgq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.360227 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.362196 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data" (OuterVolumeSpecName: "config-data") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.370048 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46bba23b-757b-4303-a92c-78cb8a956c35" (UID: "46bba23b-757b-4303-a92c-78cb8a956c35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436206 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436244 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snj4c\" (UniqueName: \"kubernetes.io/projected/7d7e8bca-49ef-4f7b-b962-a215f9a42605-kube-api-access-snj4c\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436259 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436271 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436283 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwgq4\" (UniqueName: \"kubernetes.io/projected/46bba23b-757b-4303-a92c-78cb8a956c35-kube-api-access-gwgq4\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436293 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.436305 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/46bba23b-757b-4303-a92c-78cb8a956c35-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.601439 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dth2s"] Oct 03 07:16:24 crc kubenswrapper[4810]: W1003 07:16:24.604105 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4af26805_7114_468f_90a0_20e603b0d9a3.slice/crio-3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344 WatchSource:0}: Error finding container 3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344: Status 404 returned error can't find the container with id 3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344 Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.615207 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.722226 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-bpqps"] Oct 03 07:16:24 crc kubenswrapper[4810]: W1003 07:16:24.751698 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51bd34d5_28c5_4d82_960c_79061d4d221a.slice/crio-f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d WatchSource:0}: Error finding container f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d: Status 404 returned error can't find the container with id f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.948657 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqps" event={"ID":"51bd34d5-28c5-4d82-960c-79061d4d221a","Type":"ContainerStarted","Data":"f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.951560 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fhzl5" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.951840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fhzl5" event={"ID":"46bba23b-757b-4303-a92c-78cb8a956c35","Type":"ContainerDied","Data":"82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.951890 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82336b2b23f4321345ab18ee952a034091f9a2d9f8d0252b6622cf6d331f850c" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.956540 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-073c-account-create-nkrq2" event={"ID":"7d7e8bca-49ef-4f7b-b962-a215f9a42605","Type":"ContainerDied","Data":"d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.956586 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e36ef770b316987d76c10895aa57ac66fb68a7155a533c61150569f63a93e2" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.956624 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-073c-account-create-nkrq2" Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.958301 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dth2s" event={"ID":"4af26805-7114-468f-90a0-20e603b0d9a3","Type":"ContainerStarted","Data":"3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.961211 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerStarted","Data":"db3e2a840110e13b015cb2f3ec8f381cda6865452f991a0abd017a3f0c638437"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.961259 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerStarted","Data":"3450f95d00e832442fafa3eaabe56f1ca4f5b15494c63b505249f8f1765d1adc"} Oct 03 07:16:24 crc kubenswrapper[4810]: I1003 07:16:24.964401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerStarted","Data":"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9"} Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.291795 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:16:25 crc kubenswrapper[4810]: E1003 07:16:25.309264 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46bba23b-757b-4303-a92c-78cb8a956c35" containerName="keystone-bootstrap" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.309312 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46bba23b-757b-4303-a92c-78cb8a956c35" containerName="keystone-bootstrap" Oct 03 07:16:25 crc kubenswrapper[4810]: E1003 07:16:25.309332 4810 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7e8bca-49ef-4f7b-b962-a215f9a42605" containerName="mariadb-account-create" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.309341 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7e8bca-49ef-4f7b-b962-a215f9a42605" containerName="mariadb-account-create" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.309585 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7e8bca-49ef-4f7b-b962-a215f9a42605" containerName="mariadb-account-create" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.309606 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46bba23b-757b-4303-a92c-78cb8a956c35" containerName="keystone-bootstrap" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.313471 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.320507 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.329365 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.329622 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-dkbxn" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.329822 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.330049 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.330232 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.355555 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455113 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trkk7\" (UniqueName: \"kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455647 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455712 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455784 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.455870 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.557720 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.557784 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.557856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.557914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.557967 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.558030 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trkk7\" (UniqueName: \"kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.558066 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.558109 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.564428 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.564802 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.565647 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.569989 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.570032 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.570555 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " 
pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.575529 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.579365 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trkk7\" (UniqueName: \"kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7\") pod \"keystone-5b9f9f6fd8-z8ls6\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.660198 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.979113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerStarted","Data":"1f17eff80bda95e4a3a828da788b685a51858f514dff3cf24b3bd09f553d4640"} Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.980804 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:25 crc kubenswrapper[4810]: I1003 07:16:25.980835 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.013140 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75d48cc7d6-2knhz" podStartSLOduration=3.013123248 podStartE2EDuration="3.013123248s" podCreationTimestamp="2025-10-03 07:16:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:26.004376711 +0000 UTC m=+1219.431627476" watchObservedRunningTime="2025-10-03 07:16:26.013123248 +0000 UTC m=+1219.440373983" Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.165591 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:16:26 crc kubenswrapper[4810]: W1003 07:16:26.173415 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda856067b_dba0_4860_93c3_ff4650760a4b.slice/crio-c2b56e5b7c279a06313e12ec070d16cdcd9658d80d73bb0c08f6f925a91535f6 WatchSource:0}: Error finding container c2b56e5b7c279a06313e12ec070d16cdcd9658d80d73bb0c08f6f925a91535f6: Status 404 returned error can't find the container with id c2b56e5b7c279a06313e12ec070d16cdcd9658d80d73bb0c08f6f925a91535f6 Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.988001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9f9f6fd8-z8ls6" event={"ID":"a856067b-dba0-4860-93c3-ff4650760a4b","Type":"ContainerStarted","Data":"accc7eedd538c9a716310f485205e12a3004a700295626a4fb4494a1c3cf497b"} Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.988364 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9f9f6fd8-z8ls6" event={"ID":"a856067b-dba0-4860-93c3-ff4650760a4b","Type":"ContainerStarted","Data":"c2b56e5b7c279a06313e12ec070d16cdcd9658d80d73bb0c08f6f925a91535f6"} Oct 
03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.988378 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.990225 4810 generic.go:334] "Generic (PLEG): container finished" podID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" containerID="6d769ed9f0cf1f6249f4f26aa6c0432b45547792f38f58ec8b3c51b5293b812d" exitCode=0 Oct 03 07:16:26 crc kubenswrapper[4810]: I1003 07:16:26.990553 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xk8xp" event={"ID":"adfe75ef-eb52-4d1d-8164-716ea23ead8d","Type":"ContainerDied","Data":"6d769ed9f0cf1f6249f4f26aa6c0432b45547792f38f58ec8b3c51b5293b812d"} Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.015378 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b9f9f6fd8-z8ls6" podStartSLOduration=2.015356295 podStartE2EDuration="2.015356295s" podCreationTimestamp="2025-10-03 07:16:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:27.010736865 +0000 UTC m=+1220.437987620" watchObservedRunningTime="2025-10-03 07:16:27.015356295 +0000 UTC m=+1220.442607060" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.482544 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ds94g"] Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.484058 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.487994 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lkp4k" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.488204 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.488595 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.493842 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ds94g"] Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.593351 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.593691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f468p\" (UniqueName: \"kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.593718 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.695593 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.695698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f468p\" (UniqueName: \"kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.695729 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.704239 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.711974 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f468p\" (UniqueName: \"kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.712671 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config\") pod \"neutron-db-sync-ds94g\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:27 crc kubenswrapper[4810]: I1003 07:16:27.811379 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ds94g" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.260265 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xk8xp" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.290127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data\") pod \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.355564 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data" (OuterVolumeSpecName: "config-data") pod "adfe75ef-eb52-4d1d-8164-716ea23ead8d" (UID: "adfe75ef-eb52-4d1d-8164-716ea23ead8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.394973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4mrc\" (UniqueName: \"kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc\") pod \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.395055 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle\") pod \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.395099 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data\") pod \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\" (UID: \"adfe75ef-eb52-4d1d-8164-716ea23ead8d\") " Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.396241 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.404807 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "adfe75ef-eb52-4d1d-8164-716ea23ead8d" (UID: "adfe75ef-eb52-4d1d-8164-716ea23ead8d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.411322 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc" (OuterVolumeSpecName: "kube-api-access-z4mrc") pod "adfe75ef-eb52-4d1d-8164-716ea23ead8d" (UID: "adfe75ef-eb52-4d1d-8164-716ea23ead8d"). InnerVolumeSpecName "kube-api-access-z4mrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.428439 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adfe75ef-eb52-4d1d-8164-716ea23ead8d" (UID: "adfe75ef-eb52-4d1d-8164-716ea23ead8d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.497602 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4mrc\" (UniqueName: \"kubernetes.io/projected/adfe75ef-eb52-4d1d-8164-716ea23ead8d-kube-api-access-z4mrc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.497647 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:32 crc kubenswrapper[4810]: I1003 07:16:32.497656 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/adfe75ef-eb52-4d1d-8164-716ea23ead8d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.068113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xk8xp" event={"ID":"adfe75ef-eb52-4d1d-8164-716ea23ead8d","Type":"ContainerDied","Data":"5e1d96a0ee045a06924b111fd595352b979eb7819ac78ec11fd496b3133e3388"} Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.068160 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e1d96a0ee045a06924b111fd595352b979eb7819ac78ec11fd496b3133e3388" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.068241 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xk8xp" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.712615 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:33 crc kubenswrapper[4810]: E1003 07:16:33.713035 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" containerName="glance-db-sync" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.713049 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" containerName="glance-db-sync" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.713215 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" containerName="glance-db-sync" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.714093 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717491 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717543 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717604 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717714 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzq8\" (UniqueName: \"kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.717743 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.738095 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.819829 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.819902 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.819996 4810 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.820058 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.820127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxzq8\" (UniqueName: \"kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.820147 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.880172 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.880490 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.880582 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.880611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.880862 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzq8\" (UniqueName: \"kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:33 crc kubenswrapper[4810]: I1003 07:16:33.881058 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb\") pod \"dnsmasq-dns-697b8b85f7-p542n\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.039594 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.609503 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.611703 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.621715 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.621854 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ktz64" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.622140 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.637996 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.741783 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.741833 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.741941 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.741982 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgnmp\" (UniqueName: \"kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.742146 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.742224 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.742250 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.846811 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.846913 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.846949 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847058 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847083 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847196 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847229 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgnmp\" (UniqueName: \"kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847507 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.847800 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.848136 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.854749 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.856999 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.858519 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.863332 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.865291 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.866239 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.870643 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgnmp\" (UniqueName: \"kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.879651 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.917865 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " 
pod="openstack/glance-default-external-api-0" Oct 03 07:16:34 crc kubenswrapper[4810]: I1003 07:16:34.943089 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053097 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwsnv\" (UniqueName: \"kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053157 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053321 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053366 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053666 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.053746 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155335 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155384 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155427 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwsnv\" (UniqueName: \"kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155535 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155590 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.155613 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.156811 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.157833 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.157984 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.175545 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.176196 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.177531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.179021 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwsnv\" (UniqueName: \"kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.190096 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.292094 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.908979 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:35 crc kubenswrapper[4810]: I1003 07:16:35.963058 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:48 crc kubenswrapper[4810]: E1003 07:16:48.118488 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166" Oct 03 07:16:48 crc kubenswrapper[4810]: E1003 07:16:48.120444 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84wnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dth2s_openstack(4af26805-7114-468f-90a0-20e603b0d9a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 07:16:48 crc kubenswrapper[4810]: E1003 07:16:48.121774 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dth2s" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" Oct 03 07:16:48 crc kubenswrapper[4810]: E1003 07:16:48.218989 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:f4b02f57187855a6adb5b32d9a8ed92dea2376471c6e33783b4c45f4b56b0166\\\"\"" pod="openstack/cinder-db-sync-dth2s" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" Oct 03 07:16:49 crc kubenswrapper[4810]: E1003 07:16:49.218210 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48" Oct 03 07:16:49 crc 
kubenswrapper[4810]: E1003 07:16:49.218714 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qtrnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(b37c2b52-ef15-4044-91a4-7cbbda183db3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 07:16:49 crc kubenswrapper[4810]: E1003 07:16:49.220229 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" Oct 03 07:16:49 crc kubenswrapper[4810]: I1003 07:16:49.467003 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ds94g"] Oct 03 07:16:49 crc kubenswrapper[4810]: I1003 07:16:49.868553 4810 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.190458 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.235981 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerID="82c73079ee91baea90d873524e9989bc148f60ee9858e1a6264f40a6221de216" exitCode=0 Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.236072 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" event={"ID":"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c","Type":"ContainerDied","Data":"82c73079ee91baea90d873524e9989bc148f60ee9858e1a6264f40a6221de216"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.236105 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" event={"ID":"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c","Type":"ContainerStarted","Data":"f8c51e23b74058cb015c23057ed5b472c04d63d7dec94c817d84ac42e8126268"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.240069 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerStarted","Data":"d010d1e01750ab8bcefc0ee8e5c4a9aa44393e95894e91ad12a71cd69eff4340"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.243743 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ds94g" event={"ID":"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6","Type":"ContainerStarted","Data":"9aed72a1383db33d14bca460e68cc2d23b09efa0d6304ea25bf268fbd94fc76f"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.243804 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ds94g" event={"ID":"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6","Type":"ContainerStarted","Data":"76c744ec03724a39826653c7bc3000e5a4e199f1050aedf7740dbf5fc275b9d5"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.246823 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-central-agent" containerID="cri-o://191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" gracePeriod=30 Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.247282 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqps" event={"ID":"51bd34d5-28c5-4d82-960c-79061d4d221a","Type":"ContainerStarted","Data":"c75eea83419638c2ba3c4d7f63f3fef697e22f91ee2cf79cf0191a60c2616f4c"} Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.247339 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="sg-core" containerID="cri-o://050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" gracePeriod=30 Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.247384 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-notification-agent" containerID="cri-o://2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" gracePeriod=30 Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.294230 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-db-sync-ds94g" podStartSLOduration=23.294130238 podStartE2EDuration="23.294130238s" podCreationTimestamp="2025-10-03 07:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:50.282474536 +0000 UTC m=+1243.709725281" watchObservedRunningTime="2025-10-03 07:16:50.294130238 +0000 UTC m=+1243.721380973" Oct 03 07:16:50 crc kubenswrapper[4810]: I1003 07:16:50.332112 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-bpqps" podStartSLOduration=3.860466634 podStartE2EDuration="28.332093645s" podCreationTimestamp="2025-10-03 07:16:22 +0000 UTC" firstStartedPulling="2025-10-03 07:16:24.758435816 +0000 UTC m=+1218.185686551" lastFinishedPulling="2025-10-03 07:16:49.230062817 +0000 UTC m=+1242.657313562" observedRunningTime="2025-10-03 07:16:50.325465418 +0000 UTC m=+1243.752716153" watchObservedRunningTime="2025-10-03 07:16:50.332093645 +0000 UTC m=+1243.759344380" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.015350 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168247 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168296 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168378 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168428 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168485 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168505 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtrnr\" (UniqueName: 
\"kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr\") pod \"b37c2b52-ef15-4044-91a4-7cbbda183db3\" (UID: \"b37c2b52-ef15-4044-91a4-7cbbda183db3\") " Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.168758 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.169120 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.170339 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.173139 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr" (OuterVolumeSpecName: "kube-api-access-qtrnr") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "kube-api-access-qtrnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.173541 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.175091 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts" (OuterVolumeSpecName: "scripts") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.200228 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.235641 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.251022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data" (OuterVolumeSpecName: "config-data") pod "b37c2b52-ef15-4044-91a4-7cbbda183db3" (UID: "b37c2b52-ef15-4044-91a4-7cbbda183db3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271137 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b37c2b52-ef15-4044-91a4-7cbbda183db3-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271184 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271198 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtrnr\" (UniqueName: \"kubernetes.io/projected/b37c2b52-ef15-4044-91a4-7cbbda183db3-kube-api-access-qtrnr\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271212 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271225 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.271235 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b37c2b52-ef15-4044-91a4-7cbbda183db3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.276098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerStarted","Data":"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.279415 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerStarted","Data":"6b52e78fa2731f2e93abec8da29736bee1c63b814bc055722d485d4ee295d807"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285253 4810 generic.go:334] "Generic (PLEG): container finished" podID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" exitCode=2 Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285274 4810 generic.go:334] "Generic (PLEG): container finished" podID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" exitCode=0 Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285281 4810 generic.go:334] "Generic (PLEG): container finished" podID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" exitCode=0 Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285312 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerDied","Data":"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerDied","Data":"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285341 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerDied","Data":"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285350 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b37c2b52-ef15-4044-91a4-7cbbda183db3","Type":"ContainerDied","Data":"87c2804c077e827e04b8217c0bf2a89781c723fbd52837718391f5d9ecceec3f"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285364 4810 scope.go:117] "RemoveContainer" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.285472 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.299665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" event={"ID":"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c","Type":"ContainerStarted","Data":"21b99eeeb6d8e21aac24c84d31a27536b0707c6ccb674ebed6ebb8f7a229266c"} Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.300039 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.330916 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" podStartSLOduration=18.330886158 podStartE2EDuration="18.330886158s" podCreationTimestamp="2025-10-03 07:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:51.321152817 +0000 UTC m=+1244.748403572" watchObservedRunningTime="2025-10-03 07:16:51.330886158 +0000 UTC m=+1244.758136893" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.343090 4810 scope.go:117] "RemoveContainer" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.392582 4810 scope.go:117] "RemoveContainer" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.395845 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.430647 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.452158 4810 scope.go:117] "RemoveContainer" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.452983 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": container with ID starting with 050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9 not found: ID does not exist" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.453031 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9"} err="failed to get container status \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": rpc error: code = NotFound desc = could not find container \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": container with ID starting with 050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.453055 4810 scope.go:117] "RemoveContainer" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.453763 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": container with ID starting with 2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0 not found: ID does not exist" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.453784 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0"} err="failed to get container status \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": rpc error: code = NotFound desc = could not find container \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": container with ID starting with 2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.453796 4810 scope.go:117] "RemoveContainer" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.454104 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": container with ID starting with 191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb not found: ID does not exist" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.454126 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb"} err="failed to get container status \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": rpc error: code = NotFound desc = could not find container \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": container with ID starting with 191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.454141 4810 scope.go:117] "RemoveContainer" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" Oct 03 07:16:51 crc 
kubenswrapper[4810]: I1003 07:16:51.454524 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9"} err="failed to get container status \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": rpc error: code = NotFound desc = could not find container \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": container with ID starting with 050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.454541 4810 scope.go:117] "RemoveContainer" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.455442 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0"} err="failed to get container status \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": rpc error: code = NotFound desc = could not find container \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": container with ID starting with 2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.455463 4810 scope.go:117] "RemoveContainer" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.455842 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb"} err="failed to get container status \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": rpc error: code = NotFound desc = could not find container \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": container with ID starting with 191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.455858 4810 scope.go:117] "RemoveContainer" containerID="050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.456169 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9"} err="failed to get container status \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": rpc error: code = NotFound desc = could not find container \"050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9\": container with ID starting with 050a55b07965fe0021403794723a25815305af51f955db418970190d1fa725d9 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.456188 4810 scope.go:117] "RemoveContainer" containerID="2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.456775 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0"} err="failed to get container status \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": rpc error: code = NotFound desc = could not find container \"2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0\": container with ID 
starting with 2fdbd8d90cf75d585a21754538ad781317f11f1c9c522d4b2d091d4deb8b69b0 not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.456795 4810 scope.go:117] "RemoveContainer" containerID="191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.457617 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb"} err="failed to get container status \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": rpc error: code = NotFound desc = could not find container \"191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb\": container with ID starting with 191311856be15849489203ecca9cfd80745722a59c5870406856652177b00cdb not found: ID does not exist" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470109 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.470461 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-notification-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470477 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-notification-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.470495 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="sg-core" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470501 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="sg-core" Oct 03 07:16:51 crc kubenswrapper[4810]: E1003 07:16:51.470524 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-central-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470530 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-central-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470677 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="sg-core" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470700 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-central-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.470711 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" containerName="ceilometer-notification-agent" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.473058 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.475571 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.476561 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.490183 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495641 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495685 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbc2s\" (UniqueName: \"kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495709 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495772 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495795 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.495914 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.596997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597067 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597090 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbc2s\" (UniqueName: \"kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597111 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597166 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597815 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.597871 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.598176 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.598658 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.602402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.604589 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.618526 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.620428 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbc2s\" (UniqueName: \"kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.620639 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data\") pod \"ceilometer-0\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " pod="openstack/ceilometer-0" Oct 03 07:16:51 crc kubenswrapper[4810]: I1003 07:16:51.795089 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.278468 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:16:52 crc kubenswrapper[4810]: W1003 07:16:52.279371 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94390f78_7a32_4961_9567_0f2ddbced7cb.slice/crio-05c6a5551b75ae8c2772473b53f79cea8f0ea77d230dd5a902e1d139267bb3ca WatchSource:0}: Error finding container 05c6a5551b75ae8c2772473b53f79cea8f0ea77d230dd5a902e1d139267bb3ca: Status 404 returned error can't find the container with id 05c6a5551b75ae8c2772473b53f79cea8f0ea77d230dd5a902e1d139267bb3ca Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.308155 4810 generic.go:334] "Generic (PLEG): container finished" podID="51bd34d5-28c5-4d82-960c-79061d4d221a" containerID="c75eea83419638c2ba3c4d7f63f3fef697e22f91ee2cf79cf0191a60c2616f4c" exitCode=0 Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.308249 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqps" event={"ID":"51bd34d5-28c5-4d82-960c-79061d4d221a","Type":"ContainerDied","Data":"c75eea83419638c2ba3c4d7f63f3fef697e22f91ee2cf79cf0191a60c2616f4c"} Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.310445 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerStarted","Data":"05c6a5551b75ae8c2772473b53f79cea8f0ea77d230dd5a902e1d139267bb3ca"} Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.312702 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerStarted","Data":"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d"} Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.312826 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-log" containerID="cri-o://c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" gracePeriod=30 Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.313075 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" 
containerName="glance-httpd" containerID="cri-o://6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" gracePeriod=30 Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.315274 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerStarted","Data":"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060"} Oct 03 07:16:52 crc kubenswrapper[4810]: I1003 07:16:52.887665 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.028728 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.028783 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.028870 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.028991 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwsnv\" (UniqueName: \"kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029032 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029105 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029163 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts\") pod \"d556725a-c8b9-48e1-b327-a6020916ff1c\" (UID: \"d556725a-c8b9-48e1-b327-a6020916ff1c\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029307 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029533 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.029622 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs" (OuterVolumeSpecName: "logs") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.033082 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv" (OuterVolumeSpecName: "kube-api-access-lwsnv") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "kube-api-access-lwsnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.034312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts" (OuterVolumeSpecName: "scripts") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.034605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.065941 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.096525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data" (OuterVolumeSpecName: "config-data") pod "d556725a-c8b9-48e1-b327-a6020916ff1c" (UID: "d556725a-c8b9-48e1-b327-a6020916ff1c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.131877 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.131956 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwsnv\" (UniqueName: \"kubernetes.io/projected/d556725a-c8b9-48e1-b327-a6020916ff1c-kube-api-access-lwsnv\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.132022 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.132044 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.132067 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d556725a-c8b9-48e1-b327-a6020916ff1c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.132086 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d556725a-c8b9-48e1-b327-a6020916ff1c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.147915 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.233258 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.321079 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b37c2b52-ef15-4044-91a4-7cbbda183db3" path="/var/lib/kubelet/pods/b37c2b52-ef15-4044-91a4-7cbbda183db3/volumes" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329159 4810 generic.go:334] "Generic (PLEG): container finished" podID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerID="6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" exitCode=0 Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329190 4810 generic.go:334] "Generic (PLEG): container finished" podID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerID="c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" exitCode=143 Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329245 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329242 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerDied","Data":"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d"} Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329351 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerDied","Data":"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9"} Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329368 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d556725a-c8b9-48e1-b327-a6020916ff1c","Type":"ContainerDied","Data":"d010d1e01750ab8bcefc0ee8e5c4a9aa44393e95894e91ad12a71cd69eff4340"} Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.329385 4810 scope.go:117] "RemoveContainer" containerID="6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.331979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerStarted","Data":"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12"} Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.332147 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-log" containerID="cri-o://292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" gracePeriod=30 Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.332147 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-httpd" containerID="cri-o://d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" gracePeriod=30 Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.335978 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerStarted","Data":"80359cc6ebe5f16113bcbee03d6448ea22b034de859792d00bde22bffc5d05e6"} Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.378451 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=20.37842507 podStartE2EDuration="20.37842507s" podCreationTimestamp="2025-10-03 07:16:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:53.3746906 +0000 UTC m=+1246.801941355" watchObservedRunningTime="2025-10-03 07:16:53.37842507 +0000 UTC m=+1246.805675805" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.397462 4810 scope.go:117] "RemoveContainer" containerID="c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.419736 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.428806 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.443620 4810 scope.go:117] "RemoveContainer" containerID="6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" Oct 03 07:16:53 crc kubenswrapper[4810]: E1003 07:16:53.445355 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d\": container with ID starting with 6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d not found: ID does not exist" containerID="6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.445404 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d"} err="failed to get container status \"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d\": rpc error: code = NotFound desc = could not find container \"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d\": container with ID starting with 6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d not found: ID does not exist" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.445437 4810 scope.go:117] "RemoveContainer" containerID="c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.458398 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:53 crc kubenswrapper[4810]: E1003 07:16:53.458920 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-log" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.458936 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-log" Oct 03 07:16:53 crc kubenswrapper[4810]: E1003 07:16:53.458949 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-httpd" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.458957 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-httpd" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.459175 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-log" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.459193 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" containerName="glance-httpd" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.460295 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: E1003 07:16:53.463154 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9\": container with ID starting with c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9 not found: ID does not exist" containerID="c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.463197 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9"} err="failed to get container status \"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9\": rpc error: code = NotFound desc = could not find container \"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9\": container with ID starting with c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9 not found: ID does not exist" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.463226 4810 scope.go:117] "RemoveContainer" containerID="6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.463760 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d"} err="failed to get container status \"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d\": rpc error: code = NotFound desc = could not find container \"6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d\": container with ID starting with 6d92226773dda67ff087367ae934a76902642557563872c6c373950c4a4d347d not found: ID does not exist" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.463789 4810 scope.go:117] "RemoveContainer" containerID="c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.464051 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.464429 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9"} err="failed to get container status \"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9\": rpc error: code = NotFound desc = could not find container \"c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9\": container with ID starting with c649cf9b44500f7267b658b4d84583b514c2f53c59db527f12740becec2b4de9 not found: ID does not exist" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.464854 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.469182 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646270 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646409 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646471 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646508 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646558 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2fs5\" (UniqueName: \"kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646589 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646628 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.646658 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.726091 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.747964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748404 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748448 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748488 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2fs5\" (UniqueName: \"kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.748576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.749160 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 
07:16:53.749648 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.753112 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.761272 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.761617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.761814 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.762776 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.773428 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2fs5\" (UniqueName: \"kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.802131 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.849999 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data\") pod \"51bd34d5-28c5-4d82-960c-79061d4d221a\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.850069 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle\") pod \"51bd34d5-28c5-4d82-960c-79061d4d221a\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.850188 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwl8\" (UniqueName: \"kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8\") pod \"51bd34d5-28c5-4d82-960c-79061d4d221a\" (UID: \"51bd34d5-28c5-4d82-960c-79061d4d221a\") " Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.854694 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "51bd34d5-28c5-4d82-960c-79061d4d221a" (UID: "51bd34d5-28c5-4d82-960c-79061d4d221a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.855434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8" (OuterVolumeSpecName: "kube-api-access-nzwl8") pod "51bd34d5-28c5-4d82-960c-79061d4d221a" (UID: "51bd34d5-28c5-4d82-960c-79061d4d221a"). InnerVolumeSpecName "kube-api-access-nzwl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.885572 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51bd34d5-28c5-4d82-960c-79061d4d221a" (UID: "51bd34d5-28c5-4d82-960c-79061d4d221a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.935785 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.952028 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.952074 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51bd34d5-28c5-4d82-960c-79061d4d221a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.952088 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwl8\" (UniqueName: \"kubernetes.io/projected/51bd34d5-28c5-4d82-960c-79061d4d221a-kube-api-access-nzwl8\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:53 crc kubenswrapper[4810]: I1003 07:16:53.996726 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.053699 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.053811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.053883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgnmp\" (UniqueName: \"kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.054144 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.054245 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.054361 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.054435 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\" (UID: \"276c80ca-c00a-43af-ac5c-b7c9e42c5127\") " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.058109 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs" (OuterVolumeSpecName: "logs") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.058194 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.058242 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.061614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts" (OuterVolumeSpecName: "scripts") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.061869 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp" (OuterVolumeSpecName: "kube-api-access-fgnmp") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "kube-api-access-fgnmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.090856 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.135763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data" (OuterVolumeSpecName: "config-data") pod "276c80ca-c00a-43af-ac5c-b7c9e42c5127" (UID: "276c80ca-c00a-43af-ac5c-b7c9e42c5127"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164725 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164763 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164776 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgnmp\" (UniqueName: \"kubernetes.io/projected/276c80ca-c00a-43af-ac5c-b7c9e42c5127-kube-api-access-fgnmp\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164790 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/276c80ca-c00a-43af-ac5c-b7c9e42c5127-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164798 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164807 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/276c80ca-c00a-43af-ac5c-b7c9e42c5127-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.164832 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.190079 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.266659 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356773 4810 generic.go:334] "Generic (PLEG): container finished" podID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerID="d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" exitCode=0 Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356804 4810 generic.go:334] "Generic (PLEG): container finished" podID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerID="292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" exitCode=143 Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356833 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerDied","Data":"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12"} Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356883 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerDied","Data":"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060"} Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356920 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"276c80ca-c00a-43af-ac5c-b7c9e42c5127","Type":"ContainerDied","Data":"6b52e78fa2731f2e93abec8da29736bee1c63b814bc055722d485d4ee295d807"} Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.356936 4810 scope.go:117] "RemoveContainer" containerID="d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.361587 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-bpqps" event={"ID":"51bd34d5-28c5-4d82-960c-79061d4d221a","Type":"ContainerDied","Data":"f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d"} Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.361678 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f96ea7c26c476680a741ce9dddec2817b87db5e180d8e1a7bcf58cc80984dc9d" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.361831 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-bpqps" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.391132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerStarted","Data":"d2644bca4e142921bdd5cc1ab34ada83cf13b6bef815c4417356086919802559"} Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.410938 4810 scope.go:117] "RemoveContainer" containerID="292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.451162 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.463095 4810 scope.go:117] "RemoveContainer" containerID="d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" Oct 03 07:16:54 crc kubenswrapper[4810]: E1003 07:16:54.464242 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12\": container with ID starting with d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12 not found: ID does not exist" containerID="d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.464295 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12"} err="failed to get container status \"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12\": rpc error: code = NotFound desc = could not find container \"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12\": container with ID starting with d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12 not found: ID does not exist" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.464326 4810 scope.go:117] "RemoveContainer" containerID="292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" Oct 03 07:16:54 crc kubenswrapper[4810]: E1003 07:16:54.468209 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060\": container with ID starting with 292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060 not found: ID does not exist" containerID="292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.468271 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060"} err="failed to get container status \"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060\": rpc error: code = NotFound desc = could not find container \"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060\": container with ID starting with 292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060 not found: ID does not exist" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.468297 4810 scope.go:117] "RemoveContainer" containerID="d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.468968 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:54 crc 
kubenswrapper[4810]: I1003 07:16:54.471721 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12"} err="failed to get container status \"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12\": rpc error: code = NotFound desc = could not find container \"d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12\": container with ID starting with d7163fe91620b53d0dcfdb5e4080fc837ffaf657110519b07e06d12f6434ff12 not found: ID does not exist" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.471750 4810 scope.go:117] "RemoveContainer" containerID="292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.474955 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060"} err="failed to get container status \"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060\": rpc error: code = NotFound desc = could not find container \"292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060\": container with ID starting with 292158a4917fe889b641eb14e168b2bb90ddd463e50e4d5daaee0e1e7ee19060 not found: ID does not exist" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.535846 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:54 crc kubenswrapper[4810]: E1003 07:16:54.536195 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-httpd" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536210 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-httpd" Oct 03 07:16:54 crc kubenswrapper[4810]: E1003 07:16:54.536227 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-log" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536232 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-log" Oct 03 07:16:54 crc kubenswrapper[4810]: E1003 07:16:54.536246 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51bd34d5-28c5-4d82-960c-79061d4d221a" containerName="barbican-db-sync" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536251 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="51bd34d5-28c5-4d82-960c-79061d4d221a" containerName="barbican-db-sync" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536424 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-httpd" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536450 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" containerName="glance-log" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.536466 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="51bd34d5-28c5-4d82-960c-79061d4d221a" containerName="barbican-db-sync" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.537359 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.544880 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.545802 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.566506 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:54 crc kubenswrapper[4810]: W1003 07:16:54.663733 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6569e1c4_c0f7_4146_9ba4_1bb855dfd9fc.slice/crio-f91d249919f132249c4bce7d4919cafc80e026839db412676f039cdae9db559f WatchSource:0}: Error finding container f91d249919f132249c4bce7d4919cafc80e026839db412676f039cdae9db559f: Status 404 returned error can't find the container with id f91d249919f132249c4bce7d4919cafc80e026839db412676f039cdae9db559f Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673781 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673848 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673880 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9pl\" (UniqueName: \"kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673914 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673974 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" 
Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.673993 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.674032 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.678264 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775391 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775451 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775487 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9pl\" (UniqueName: \"kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775507 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775539 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775569 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775589 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.775628 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.776070 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.776335 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.776695 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.780479 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.781457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.782032 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.786440 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.805158 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9pl\" (UniqueName: \"kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 
07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.827079 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.861371 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.948085 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.950292 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.956952 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.957690 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.957805 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qpvg4" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.967768 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.969742 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:54 crc kubenswrapper[4810]: I1003 07:16:54.973826 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.001478 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.055282 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.084999 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085059 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085120 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " 
pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085152 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085177 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzgt\" (UniqueName: \"kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085286 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085305 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.085332 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6zs\" (UniqueName: \"kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.117538 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.117833 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="dnsmasq-dns" containerID="cri-o://21b99eeeb6d8e21aac24c84d31a27536b0707c6ccb674ebed6ebb8f7a229266c" gracePeriod=10 
Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.121829 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.125055 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.131471 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.152160 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186459 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186507 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186551 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186603 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186628 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghzgt\" (UniqueName: \"kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186643 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186666 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186698 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186764 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186780 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186810 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6zs\" (UniqueName: \"kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186868 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qrn\" (UniqueName: \"kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.186887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: 
\"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.187989 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.188593 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.204127 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.212777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.213276 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.215496 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.222849 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.223450 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data\") pod \"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.230429 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghzgt\" (UniqueName: \"kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt\") pod 
\"barbican-worker-75dfb4687f-c6pvn\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.239367 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.241025 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.242749 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.254795 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6zs\" (UniqueName: \"kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs\") pod \"barbican-keystone-listener-6887c7bd68-bxt6s\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.263968 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290140 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qrn\" (UniqueName: \"kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290200 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290224 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290243 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290278 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.290297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " 
pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.291437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.291741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.291785 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.291920 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.293026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.354719 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.389165 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="276c80ca-c00a-43af-ac5c-b7c9e42c5127" path="/var/lib/kubelet/pods/276c80ca-c00a-43af-ac5c-b7c9e42c5127/volumes" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.390073 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d556725a-c8b9-48e1-b327-a6020916ff1c" path="/var/lib/kubelet/pods/d556725a-c8b9-48e1-b327-a6020916ff1c/volumes" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.392457 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.393304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.393362 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.393388 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbg4z\" (UniqueName: \"kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.393541 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.400953 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.413403 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerStarted","Data":"9a534f1171d5b91c7f851d206340aac873299744de307830b38e5f24547a32c2"} Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.416029 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qrn\" (UniqueName: \"kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn\") pod \"dnsmasq-dns-78b6bb6597-jtfhd\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.463259 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerID="21b99eeeb6d8e21aac24c84d31a27536b0707c6ccb674ebed6ebb8f7a229266c" exitCode=0 Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.463376 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" event={"ID":"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c","Type":"ContainerDied","Data":"21b99eeeb6d8e21aac24c84d31a27536b0707c6ccb674ebed6ebb8f7a229266c"} Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.486986 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerStarted","Data":"f91d249919f132249c4bce7d4919cafc80e026839db412676f039cdae9db559f"} Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.497231 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.497277 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.497297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbg4z\" (UniqueName: \"kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.507328 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.508462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " 
pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.508614 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.509999 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.520174 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.523477 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbg4z\" (UniqueName: \"kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.524740 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle\") pod \"barbican-api-764d46d444-sxpld\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.604402 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.699965 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.755182 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.759396 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.806090 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.834050 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.834221 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.834337 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.834546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxzq8\" (UniqueName: \"kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.835251 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.835363 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc\") pod \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\" (UID: \"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c\") " Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.863609 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8" (OuterVolumeSpecName: "kube-api-access-sxzq8") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "kube-api-access-sxzq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.918162 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.919177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.925371 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config" (OuterVolumeSpecName: "config") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.938209 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.939550 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxzq8\" (UniqueName: \"kubernetes.io/projected/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-kube-api-access-sxzq8\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.939579 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.939610 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.939621 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.939631 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:55 crc kubenswrapper[4810]: I1003 07:16:55.975068 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" (UID: "ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.045797 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.053069 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.055670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.133046 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:16:56 crc kubenswrapper[4810]: W1003 07:16:56.144600 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d279481_fd9c_4c0f_b7d6_ce457c1a697e.slice/crio-4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9 WatchSource:0}: Error finding container 4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9: Status 404 returned error can't find the container with id 4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9 Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.162805 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.432023 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:16:56 crc kubenswrapper[4810]: W1003 07:16:56.508052 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod112c7019_188f_47ab_bf5f_a15fddf701eb.slice/crio-ead594214a97be9a0b22dd5beba058969ceb88a8e4bce54f040b309051cda3d7 WatchSource:0}: Error finding container ead594214a97be9a0b22dd5beba058969ceb88a8e4bce54f040b309051cda3d7: Status 404 returned error can't find the container with id ead594214a97be9a0b22dd5beba058969ceb88a8e4bce54f040b309051cda3d7 Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.527131 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerStarted","Data":"c742a0eb9bec9574678fd9f89424f9537fd85ffd8b31d33876ce03b40e6c0164"} Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.531057 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerStarted","Data":"5cde710dea441b447f0574eac0c9bed656d5494ca3ecea35c182c2590d8ea855"} Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.534870 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerStarted","Data":"024c48216f5fe8ff3dc58959f6971cecd1b851179685ac4bdb0d43c86358ed68"} Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.537843 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" event={"ID":"ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c","Type":"ContainerDied","Data":"f8c51e23b74058cb015c23057ed5b472c04d63d7dec94c817d84ac42e8126268"} Oct 03 07:16:56 crc 
kubenswrapper[4810]: I1003 07:16:56.537907 4810 scope.go:117] "RemoveContainer" containerID="21b99eeeb6d8e21aac24c84d31a27536b0707c6ccb674ebed6ebb8f7a229266c" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.538016 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-697b8b85f7-p542n" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.570694 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerStarted","Data":"4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9"} Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.573989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" event={"ID":"d446e6af-e752-4c4d-868a-414c0bfd0104","Type":"ContainerStarted","Data":"0d1eb6ebc941d2b6bfc30b0695b4b7d194e0a60e4a18b2b22b4c9a2129ce6d49"} Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.667383 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.674466 4810 scope.go:117] "RemoveContainer" containerID="82c73079ee91baea90d873524e9989bc148f60ee9858e1a6264f40a6221de216" Oct 03 07:16:56 crc kubenswrapper[4810]: I1003 07:16:56.680455 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-697b8b85f7-p542n"] Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.333576 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" path="/var/lib/kubelet/pods/ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c/volumes" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.602414 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerStarted","Data":"4fc1312223f6be73f6093c9c399d707a59a6c14f3f19c960d1f5dfc25309cff7"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.604564 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerStarted","Data":"756b6d336d0d23d3125897ab180755e567052e3baa802bfff2e486d6e8caf1be"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.604585 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerStarted","Data":"8860e0aa7676e2d97867298d2f6abc0271642fceb46d2405a1131cd31d8eaa7a"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.604595 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerStarted","Data":"ead594214a97be9a0b22dd5beba058969ceb88a8e4bce54f040b309051cda3d7"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.605555 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.605575 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.608861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerStarted","Data":"f31b0e45e6bc358ad5e97e631e4e62b1301370888dd93727bba7a6cef64bc41c"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.609430 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.615887 4810 generic.go:334] "Generic (PLEG): container finished" podID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerID="92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f" exitCode=0 Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.616034 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" event={"ID":"d446e6af-e752-4c4d-868a-414c0bfd0104","Type":"ContainerDied","Data":"92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.621998 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerStarted","Data":"0c95f1834c0b986a889fd50e7f4669a436ddbc7e4054263b236d8a8a96273957"} Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.623872 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-764d46d444-sxpld" podStartSLOduration=2.623856264 podStartE2EDuration="2.623856264s" podCreationTimestamp="2025-10-03 07:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:57.621275115 +0000 UTC m=+1251.048525850" watchObservedRunningTime="2025-10-03 07:16:57.623856264 +0000 UTC m=+1251.051106999" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.702997 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.467769764 podStartE2EDuration="6.702965314s" podCreationTimestamp="2025-10-03 07:16:51 +0000 UTC" firstStartedPulling="2025-10-03 07:16:52.281805707 +0000 UTC m=+1245.709056452" lastFinishedPulling="2025-10-03 07:16:56.517001267 +0000 UTC m=+1249.944252002" observedRunningTime="2025-10-03 07:16:57.669372274 +0000 UTC m=+1251.096623009" watchObservedRunningTime="2025-10-03 07:16:57.702965314 +0000 UTC m=+1251.130216049" Oct 03 07:16:57 crc kubenswrapper[4810]: I1003 07:16:57.721641 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.721623474 podStartE2EDuration="4.721623474s" podCreationTimestamp="2025-10-03 07:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:57.689562885 +0000 UTC m=+1251.116813630" watchObservedRunningTime="2025-10-03 07:16:57.721623474 +0000 UTC m=+1251.148874209" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.042126 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.432572 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:16:58 crc kubenswrapper[4810]: E1003 07:16:58.433259 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="dnsmasq-dns" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.433276 
4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="dnsmasq-dns" Oct 03 07:16:58 crc kubenswrapper[4810]: E1003 07:16:58.433294 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="init" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.433301 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="init" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.433479 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0f1d50-938a-459e-a2ea-cbbd6f00fe0c" containerName="dnsmasq-dns" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.434480 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.438189 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.438425 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.449156 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506355 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506763 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506843 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506903 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506936 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.506971 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.507013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmc8\" (UniqueName: \"kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608288 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608337 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmc8\" (UniqueName: \"kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608408 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608489 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.608925 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.623348 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.627846 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.631720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.632350 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.633476 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.635944 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmc8\" (UniqueName: \"kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8\") pod \"barbican-api-db7b99d8d-9txpw\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.647268 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerStarted","Data":"9ac4fcfc9f7cc315241fd9f62b9064b3b6483a02aae6e95db87cbafb1e876e0c"} Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.654057 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" event={"ID":"d446e6af-e752-4c4d-868a-414c0bfd0104","Type":"ContainerStarted","Data":"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2"} Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.678467 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.678448561 podStartE2EDuration="4.678448561s" podCreationTimestamp="2025-10-03 07:16:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:58.677365713 +0000 UTC m=+1252.104616448" watchObservedRunningTime="2025-10-03 07:16:58.678448561 +0000 UTC m=+1252.105699296" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.726836 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" podStartSLOduration=3.7268177380000003 podStartE2EDuration="3.726817738s" podCreationTimestamp="2025-10-03 07:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:16:58.725710568 +0000 UTC m=+1252.152961303" watchObservedRunningTime="2025-10-03 07:16:58.726817738 +0000 UTC m=+1252.154068473" Oct 03 07:16:58 crc kubenswrapper[4810]: I1003 07:16:58.751421 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.267139 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.490866 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.494705 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.500154 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h86hs" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.500405 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.500540 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.550000 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.652575 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: E1003 07:16:59.653246 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-gf2tt openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="f8d84e78-3768-423c-8c56-3e8a68033f3d" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.654310 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.654419 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2tt\" (UniqueName: \"kubernetes.io/projected/f8d84e78-3768-423c-8c56-3e8a68033f3d-kube-api-access-gf2tt\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.654466 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.654708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.671636 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.675549 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerStarted","Data":"d9d3ff248266b41b739b029c6e20e9253b3a96fceeb907d9681d132dbed6d665"} Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.677400 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerStarted","Data":"116ca4730fd85c0b04786638fed840cf4e9ac1f674f214b442febeb58db8324f"} Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.680314 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerStarted","Data":"bab0144eacbf846c30a39d5010d0dd6c1fc6438f1979f0023937f3978740812f"} Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.680728 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.680764 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.688706 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.746545 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.747840 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.757964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.758013 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.758084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2tt\" (UniqueName: \"kubernetes.io/projected/f8d84e78-3768-423c-8c56-3e8a68033f3d-kube-api-access-gf2tt\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.758105 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.758865 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.759939 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.764656 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.772818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: E1003 07:16:59.773150 4810 projected.go:194] Error preparing data for projected volume kube-api-access-gf2tt for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f8d84e78-3768-423c-8c56-3e8a68033f3d) does not match the UID in record. The object might have been deleted and then recreated Oct 03 07:16:59 crc kubenswrapper[4810]: E1003 07:16:59.773291 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8d84e78-3768-423c-8c56-3e8a68033f3d-kube-api-access-gf2tt podName:f8d84e78-3768-423c-8c56-3e8a68033f3d nodeName:}" failed. 
No retries permitted until 2025-10-03 07:17:00.273272867 +0000 UTC m=+1253.700523602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gf2tt" (UniqueName: "kubernetes.io/projected/f8d84e78-3768-423c-8c56-3e8a68033f3d-kube-api-access-gf2tt") pod "openstackclient" (UID: "f8d84e78-3768-423c-8c56-3e8a68033f3d") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f8d84e78-3768-423c-8c56-3e8a68033f3d) does not match the UID in record. The object might have been deleted and then recreated Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.776360 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f8d84e78-3768-423c-8c56-3e8a68033f3d" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.859747 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config\") pod \"f8d84e78-3768-423c-8c56-3e8a68033f3d\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.859851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret\") pod \"f8d84e78-3768-423c-8c56-3e8a68033f3d\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.860003 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle\") pod \"f8d84e78-3768-423c-8c56-3e8a68033f3d\" (UID: \"f8d84e78-3768-423c-8c56-3e8a68033f3d\") " Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.860786 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.860840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.860872 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.860911 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gfd7\" (UniqueName: \"kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc 
kubenswrapper[4810]: I1003 07:16:59.861020 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf2tt\" (UniqueName: \"kubernetes.io/projected/f8d84e78-3768-423c-8c56-3e8a68033f3d-kube-api-access-gf2tt\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.863048 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f8d84e78-3768-423c-8c56-3e8a68033f3d" (UID: "f8d84e78-3768-423c-8c56-3e8a68033f3d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.868727 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8d84e78-3768-423c-8c56-3e8a68033f3d" (UID: "f8d84e78-3768-423c-8c56-3e8a68033f3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.870996 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f8d84e78-3768-423c-8c56-3e8a68033f3d" (UID: "f8d84e78-3768-423c-8c56-3e8a68033f3d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.963601 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.963941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.963977 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.964006 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gfd7\" (UniqueName: \"kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.964648 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.964875 4810 
reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.964916 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.964930 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d84e78-3768-423c-8c56-3e8a68033f3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.968526 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.971583 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:16:59 crc kubenswrapper[4810]: I1003 07:16:59.986181 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gfd7\" (UniqueName: \"kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7\") pod \"openstackclient\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " pod="openstack/openstackclient" Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.108855 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.579086 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.689809 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerStarted","Data":"ee5aefdafa078a9fbd3fd95199dc8e09c3b1a213b866d27681b0e68a66781605"} Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.690993 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b3a80a81-d436-4fa8-b5b7-560348449df3","Type":"ContainerStarted","Data":"d0861ad4789737788e021684256220ee0ae2269810f9bb6d8bb65e9dbe130c94"} Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.692339 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerStarted","Data":"c466dfa6a59d96e0d7faf89c262bdc91ffef8e54dbc15010039c1aa7f1e0a15a"} Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.696117 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.699260 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerStarted","Data":"a7bf0333a5fea00209ffe7878043268796bb6130c3b253a0030e2a551d03432e"} Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.716055 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-75dfb4687f-c6pvn" podStartSLOduration=3.916154727 podStartE2EDuration="6.716039688s" podCreationTimestamp="2025-10-03 07:16:54 +0000 UTC" firstStartedPulling="2025-10-03 07:16:56.075772724 +0000 UTC m=+1249.503023449" lastFinishedPulling="2025-10-03 07:16:58.875657675 +0000 UTC m=+1252.302908410" observedRunningTime="2025-10-03 07:17:00.712327398 +0000 UTC m=+1254.139578153" watchObservedRunningTime="2025-10-03 07:17:00.716039688 +0000 UTC m=+1254.143290423" Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.738151 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f8d84e78-3768-423c-8c56-3e8a68033f3d" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" Oct 03 07:17:00 crc kubenswrapper[4810]: I1003 07:17:00.753130 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" podStartSLOduration=4.044609088 podStartE2EDuration="6.753110121s" podCreationTimestamp="2025-10-03 07:16:54 +0000 UTC" firstStartedPulling="2025-10-03 07:16:56.165142959 +0000 UTC m=+1249.592393694" lastFinishedPulling="2025-10-03 07:16:58.873643982 +0000 UTC m=+1252.300894727" observedRunningTime="2025-10-03 07:17:00.735072217 +0000 UTC m=+1254.162322972" watchObservedRunningTime="2025-10-03 07:17:00.753110121 +0000 UTC m=+1254.180360856" Oct 03 07:17:01 crc kubenswrapper[4810]: I1003 07:17:01.312304 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d84e78-3768-423c-8c56-3e8a68033f3d" path="/var/lib/kubelet/pods/f8d84e78-3768-423c-8c56-3e8a68033f3d/volumes" Oct 03 07:17:01 crc kubenswrapper[4810]: I1003 07:17:01.707759 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerStarted","Data":"150fc8415216347ff4c833f2ebf152af21b61c3aaf0e38388292d296069c03af"} Oct 03 07:17:01 crc kubenswrapper[4810]: I1003 07:17:01.708062 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:17:01 crc kubenswrapper[4810]: I1003 07:17:01.708102 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:17:01 crc kubenswrapper[4810]: I1003 07:17:01.735250 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-db7b99d8d-9txpw" podStartSLOduration=3.7352249669999997 podStartE2EDuration="3.735224967s" podCreationTimestamp="2025-10-03 07:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:01.723302207 +0000 UTC m=+1255.150552952" watchObservedRunningTime="2025-10-03 07:17:01.735224967 +0000 UTC m=+1255.162475702" Oct 03 07:17:02 crc kubenswrapper[4810]: I1003 07:17:02.088859 4810 patch_prober.go:28] interesting 
pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:17:02 crc kubenswrapper[4810]: I1003 07:17:02.088948 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:17:03 crc kubenswrapper[4810]: I1003 07:17:03.997604 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:03 crc kubenswrapper[4810]: I1003 07:17:03.998029 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.045344 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.047557 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.737362 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.737790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.862495 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.862668 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.918408 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 07:17:04 crc kubenswrapper[4810]: I1003 07:17:04.933512 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 07:17:05 crc kubenswrapper[4810]: I1003 07:17:05.606885 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:17:05 crc kubenswrapper[4810]: I1003 07:17:05.712459 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:17:05 crc kubenswrapper[4810]: I1003 07:17:05.712699 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="dnsmasq-dns" containerID="cri-o://0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926" gracePeriod=10 Oct 03 07:17:05 crc kubenswrapper[4810]: I1003 07:17:05.750741 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 07:17:05 crc kubenswrapper[4810]: I1003 07:17:05.751129 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 07:17:06 crc 
kubenswrapper[4810]: I1003 07:17:06.746247 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.780767 4810 generic.go:334] "Generic (PLEG): container finished" podID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerID="0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926" exitCode=0 Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.781741 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.782112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" event={"ID":"5864a44c-e8be-40af-a27d-9659ce6d532a","Type":"ContainerDied","Data":"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926"} Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.782174 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c49cc7cc-fvzg7" event={"ID":"5864a44c-e8be-40af-a27d-9659ce6d532a","Type":"ContainerDied","Data":"5bd71e0c5be807b576f87600610fc57dd3f8ca9661bd3a066e742e909358d32a"} Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.782196 4810 scope.go:117] "RemoveContainer" containerID="0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.831352 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.831737 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.831936 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs4wp\" (UniqueName: \"kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.831962 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.832044 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.832087 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb\") pod \"5864a44c-e8be-40af-a27d-9659ce6d532a\" (UID: \"5864a44c-e8be-40af-a27d-9659ce6d532a\") " Oct 03 07:17:06 crc 
kubenswrapper[4810]: I1003 07:17:06.857169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp" (OuterVolumeSpecName: "kube-api-access-fs4wp") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "kube-api-access-fs4wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.898212 4810 scope.go:117] "RemoveContainer" containerID="2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.936170 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs4wp\" (UniqueName: \"kubernetes.io/projected/5864a44c-e8be-40af-a27d-9659ce6d532a-kube-api-access-fs4wp\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.946588 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.964432 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:06 crc kubenswrapper[4810]: I1003 07:17:06.986257 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.008407 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.013485 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config" (OuterVolumeSpecName: "config") pod "5864a44c-e8be-40af-a27d-9659ce6d532a" (UID: "5864a44c-e8be-40af-a27d-9659ce6d532a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.039456 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.039502 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.039516 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.039531 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.039544 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5864a44c-e8be-40af-a27d-9659ce6d532a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.113563 4810 scope.go:117] "RemoveContainer" containerID="0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926" Oct 03 07:17:07 crc kubenswrapper[4810]: E1003 07:17:07.116498 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926\": container with ID starting with 0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926 not found: ID does not exist" containerID="0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.116533 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926"} err="failed to get container status \"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926\": rpc error: code = NotFound desc = could not find container \"0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926\": container with ID starting with 0fcd2cd855053606711d8aa6629d8c3c17f739d3cc3f2101fa8daac4ec685926 not found: ID does not exist" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.116561 4810 scope.go:117] "RemoveContainer" containerID="2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09" Oct 03 07:17:07 crc kubenswrapper[4810]: E1003 07:17:07.117804 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09\": container with ID starting with 2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09 not found: ID does not exist" containerID="2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.117833 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09"} err="failed to get container status 
\"2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09\": rpc error: code = NotFound desc = could not find container \"2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09\": container with ID starting with 2e399478d055687967da424191ef6149071690b18142a2bbff2d2c6412920e09 not found: ID does not exist" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.148056 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.154885 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c49cc7cc-fvzg7"] Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.318399 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" path="/var/lib/kubelet/pods/5864a44c-e8be-40af-a27d-9659ce6d532a/volumes" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.594974 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.595315 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.598635 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.698538 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.828975 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dth2s" event={"ID":"4af26805-7114-468f-90a0-20e603b0d9a3","Type":"ContainerStarted","Data":"9a80e38f2f7cb35c6df40631d07f6fa9af9ccb09c4a52542e08ea4afc6fc13d5"} Oct 03 07:17:07 crc kubenswrapper[4810]: I1003 07:17:07.870359 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dth2s" podStartSLOduration=3.974511083 podStartE2EDuration="45.870340473s" podCreationTimestamp="2025-10-03 07:16:22 +0000 UTC" firstStartedPulling="2025-10-03 07:16:24.60694758 +0000 UTC m=+1218.034198316" lastFinishedPulling="2025-10-03 07:17:06.502776971 +0000 UTC m=+1259.930027706" observedRunningTime="2025-10-03 07:17:07.846807462 +0000 UTC m=+1261.274058197" watchObservedRunningTime="2025-10-03 07:17:07.870340473 +0000 UTC m=+1261.297591208" Oct 03 07:17:08 crc kubenswrapper[4810]: I1003 07:17:08.036832 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:08 crc kubenswrapper[4810]: I1003 07:17:08.558096 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 07:17:08 crc kubenswrapper[4810]: I1003 07:17:08.558493 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 07:17:10 crc kubenswrapper[4810]: I1003 07:17:10.653562 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:17:10 crc kubenswrapper[4810]: I1003 07:17:10.964903 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.044993 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.045319 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764d46d444-sxpld" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api-log" containerID="cri-o://8860e0aa7676e2d97867298d2f6abc0271642fceb46d2405a1131cd31d8eaa7a" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.045640 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764d46d444-sxpld" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api" containerID="cri-o://756b6d336d0d23d3125897ab180755e567052e3baa802bfff2e486d6e8caf1be" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.323005 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:17:11 crc kubenswrapper[4810]: E1003 07:17:11.323631 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="init" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.323643 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="init" Oct 03 07:17:11 crc kubenswrapper[4810]: E1003 07:17:11.323682 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="dnsmasq-dns" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.323690 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="dnsmasq-dns" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.323854 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5864a44c-e8be-40af-a27d-9659ce6d532a" containerName="dnsmasq-dns" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.324749 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.324832 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.328043 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.332419 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.332590 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427040 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427112 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427160 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dgt\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427210 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427431 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427513 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.427543 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.483610 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.483950 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-central-agent" containerID="cri-o://80359cc6ebe5f16113bcbee03d6448ea22b034de859792d00bde22bffc5d05e6" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.484082 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-notification-agent" containerID="cri-o://d2644bca4e142921bdd5cc1ab34ada83cf13b6bef815c4417356086919802559" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.484076 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="proxy-httpd" containerID="cri-o://f31b0e45e6bc358ad5e97e631e4e62b1301370888dd93727bba7a6cef64bc41c" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.484204 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="sg-core" containerID="cri-o://9a534f1171d5b91c7f851d206340aac873299744de307830b38e5f24547a32c2" gracePeriod=30 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.488887 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529597 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dgt\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529659 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529709 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " 
pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529791 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529816 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529839 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.529873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.533714 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.535716 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.538471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.540795 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.557960 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.559787 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.562629 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.572943 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dgt\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt\") pod \"swift-proxy-766c6867bf-vjpbj\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.643810 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.896561 4810 generic.go:334] "Generic (PLEG): container finished" podID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerID="8860e0aa7676e2d97867298d2f6abc0271642fceb46d2405a1131cd31d8eaa7a" exitCode=143 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.896914 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerDied","Data":"8860e0aa7676e2d97867298d2f6abc0271642fceb46d2405a1131cd31d8eaa7a"} Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.902041 4810 generic.go:334] "Generic (PLEG): container finished" podID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerID="f31b0e45e6bc358ad5e97e631e4e62b1301370888dd93727bba7a6cef64bc41c" exitCode=0 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.902086 4810 generic.go:334] "Generic (PLEG): container finished" podID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerID="9a534f1171d5b91c7f851d206340aac873299744de307830b38e5f24547a32c2" exitCode=2 Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.902111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerDied","Data":"f31b0e45e6bc358ad5e97e631e4e62b1301370888dd93727bba7a6cef64bc41c"} Oct 03 07:17:11 crc kubenswrapper[4810]: I1003 07:17:11.902139 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerDied","Data":"9a534f1171d5b91c7f851d206340aac873299744de307830b38e5f24547a32c2"} Oct 03 07:17:12 crc kubenswrapper[4810]: W1003 07:17:12.273926 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0b20ced_eb2f_41a3_8988_611615af5759.slice/crio-0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973 WatchSource:0}: Error finding container 0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973: Status 404 returned error can't find the container with id 0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973 Oct 03 07:17:12 crc kubenswrapper[4810]: I1003 07:17:12.286334 4810 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:17:12 crc kubenswrapper[4810]: I1003 07:17:12.929939 4810 generic.go:334] "Generic (PLEG): container finished" podID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerID="80359cc6ebe5f16113bcbee03d6448ea22b034de859792d00bde22bffc5d05e6" exitCode=0 Oct 03 07:17:12 crc kubenswrapper[4810]: I1003 07:17:12.930145 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerDied","Data":"80359cc6ebe5f16113bcbee03d6448ea22b034de859792d00bde22bffc5d05e6"} Oct 03 07:17:12 crc kubenswrapper[4810]: I1003 07:17:12.940753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerStarted","Data":"d1f756ac00c5c1c73d319dfc9ee7cf96f32a3e5421d41971071b05ee6e0f4c9e"} Oct 03 07:17:12 crc kubenswrapper[4810]: I1003 07:17:12.940812 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerStarted","Data":"0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973"} Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.686231 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.686680 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-log" containerID="cri-o://4fc1312223f6be73f6093c9c399d707a59a6c14f3f19c960d1f5dfc25309cff7" gracePeriod=30 Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.686901 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-httpd" containerID="cri-o://9ac4fcfc9f7cc315241fd9f62b9064b3b6483a02aae6e95db87cbafb1e876e0c" gracePeriod=30 Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.953705 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerStarted","Data":"3f01201fc73b718d0aea9571880ce1f2a30e6b5a5955dcbaa7b43e5aedc4fe65"} Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.954260 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.954307 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.956657 4810 generic.go:334] "Generic (PLEG): container finished" podID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerID="4fc1312223f6be73f6093c9c399d707a59a6c14f3f19c960d1f5dfc25309cff7" exitCode=143 Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.956710 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerDied","Data":"4fc1312223f6be73f6093c9c399d707a59a6c14f3f19c960d1f5dfc25309cff7"} Oct 03 07:17:13 crc kubenswrapper[4810]: I1003 07:17:13.980357 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-766c6867bf-vjpbj" 
podStartSLOduration=2.9803314370000002 podStartE2EDuration="2.980331437s" podCreationTimestamp="2025-10-03 07:17:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:13.973862824 +0000 UTC m=+1267.401113559" watchObservedRunningTime="2025-10-03 07:17:13.980331437 +0000 UTC m=+1267.407582172" Oct 03 07:17:14 crc kubenswrapper[4810]: I1003 07:17:14.971837 4810 generic.go:334] "Generic (PLEG): container finished" podID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerID="756b6d336d0d23d3125897ab180755e567052e3baa802bfff2e486d6e8caf1be" exitCode=0 Oct 03 07:17:14 crc kubenswrapper[4810]: I1003 07:17:14.971961 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerDied","Data":"756b6d336d0d23d3125897ab180755e567052e3baa802bfff2e486d6e8caf1be"} Oct 03 07:17:14 crc kubenswrapper[4810]: I1003 07:17:14.980833 4810 generic.go:334] "Generic (PLEG): container finished" podID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerID="d2644bca4e142921bdd5cc1ab34ada83cf13b6bef815c4417356086919802559" exitCode=0 Oct 03 07:17:14 crc kubenswrapper[4810]: I1003 07:17:14.980947 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerDied","Data":"d2644bca4e142921bdd5cc1ab34ada83cf13b6bef815c4417356086919802559"} Oct 03 07:17:15 crc kubenswrapper[4810]: I1003 07:17:15.700723 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764d46d444-sxpld" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": dial tcp 10.217.0.157:9311: connect: connection refused" Oct 03 07:17:15 crc kubenswrapper[4810]: I1003 07:17:15.700753 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764d46d444-sxpld" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": dial tcp 10.217.0.157:9311: connect: connection refused" Oct 03 07:17:15 crc kubenswrapper[4810]: I1003 07:17:15.992095 4810 generic.go:334] "Generic (PLEG): container finished" podID="4af26805-7114-468f-90a0-20e603b0d9a3" containerID="9a80e38f2f7cb35c6df40631d07f6fa9af9ccb09c4a52542e08ea4afc6fc13d5" exitCode=0 Oct 03 07:17:15 crc kubenswrapper[4810]: I1003 07:17:15.992149 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dth2s" event={"ID":"4af26805-7114-468f-90a0-20e603b0d9a3","Type":"ContainerDied","Data":"9a80e38f2f7cb35c6df40631d07f6fa9af9ccb09c4a52542e08ea4afc6fc13d5"} Oct 03 07:17:16 crc kubenswrapper[4810]: I1003 07:17:16.395789 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:16 crc kubenswrapper[4810]: I1003 07:17:16.396096 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-log" containerID="cri-o://c742a0eb9bec9574678fd9f89424f9537fd85ffd8b31d33876ce03b40e6c0164" gracePeriod=30 Oct 03 07:17:16 crc kubenswrapper[4810]: I1003 07:17:16.396267 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-httpd" containerID="cri-o://0c95f1834c0b986a889fd50e7f4669a436ddbc7e4054263b236d8a8a96273957" gracePeriod=30 Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.003938 4810 generic.go:334] "Generic (PLEG): container finished" podID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerID="c742a0eb9bec9574678fd9f89424f9537fd85ffd8b31d33876ce03b40e6c0164" exitCode=143 Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.004009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerDied","Data":"c742a0eb9bec9574678fd9f89424f9537fd85ffd8b31d33876ce03b40e6c0164"} Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.007385 4810 generic.go:334] "Generic (PLEG): container finished" podID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerID="9ac4fcfc9f7cc315241fd9f62b9064b3b6483a02aae6e95db87cbafb1e876e0c" exitCode=0 Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.007456 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerDied","Data":"9ac4fcfc9f7cc315241fd9f62b9064b3b6483a02aae6e95db87cbafb1e876e0c"} Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.761019 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9kkxh"] Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.762287 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.783574 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9kkxh"] Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.863247 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbj5\" (UniqueName: \"kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5\") pod \"nova-api-db-create-9kkxh\" (UID: \"384a7a09-09bc-4322-b41c-073344b351ef\") " pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.869853 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-t2cwf"] Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.871270 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.882453 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t2cwf"] Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.965286 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbj5\" (UniqueName: \"kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5\") pod \"nova-api-db-create-9kkxh\" (UID: \"384a7a09-09bc-4322-b41c-073344b351ef\") " pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:17 crc kubenswrapper[4810]: I1003 07:17:17.991157 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbj5\" (UniqueName: \"kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5\") pod \"nova-api-db-create-9kkxh\" (UID: \"384a7a09-09bc-4322-b41c-073344b351ef\") " pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.076321 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-865ck"] Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.078049 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstzx\" (UniqueName: \"kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx\") pod \"nova-cell0-db-create-t2cwf\" (UID: \"f135c1f3-8f57-4fba-b0a3-b83ef524acfd\") " pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.078909 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.086640 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.098752 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-865ck"] Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.203303 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstzx\" (UniqueName: \"kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx\") pod \"nova-cell0-db-create-t2cwf\" (UID: \"f135c1f3-8f57-4fba-b0a3-b83ef524acfd\") " pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.233719 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstzx\" (UniqueName: \"kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx\") pod \"nova-cell0-db-create-t2cwf\" (UID: \"f135c1f3-8f57-4fba-b0a3-b83ef524acfd\") " pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.306350 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbtp\" (UniqueName: \"kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp\") pod \"nova-cell1-db-create-865ck\" (UID: \"6d076331-4162-4833-a443-d5060885eaa3\") " pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.407773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbtp\" (UniqueName: \"kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp\") pod \"nova-cell1-db-create-865ck\" (UID: \"6d076331-4162-4833-a443-d5060885eaa3\") " pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.423453 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbtp\" (UniqueName: \"kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp\") pod \"nova-cell1-db-create-865ck\" (UID: \"6d076331-4162-4833-a443-d5060885eaa3\") " pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.496584 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:18 crc kubenswrapper[4810]: I1003 07:17:18.707370 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.683710 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dth2s" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.834780 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84wnv\" (UniqueName: \"kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835119 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835199 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835216 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835235 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835269 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data\") pod \"4af26805-7114-468f-90a0-20e603b0d9a3\" (UID: \"4af26805-7114-468f-90a0-20e603b0d9a3\") " Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835420 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.835590 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4af26805-7114-468f-90a0-20e603b0d9a3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.841986 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv" (OuterVolumeSpecName: "kube-api-access-84wnv") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "kube-api-access-84wnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.847219 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts" (OuterVolumeSpecName: "scripts") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.849288 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.879837 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.919457 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data" (OuterVolumeSpecName: "config-data") pod "4af26805-7114-468f-90a0-20e603b0d9a3" (UID: "4af26805-7114-468f-90a0-20e603b0d9a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.937618 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.937655 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.937665 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.937676 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84wnv\" (UniqueName: \"kubernetes.io/projected/4af26805-7114-468f-90a0-20e603b0d9a3-kube-api-access-84wnv\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:19 crc kubenswrapper[4810]: I1003 07:17:19.937686 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4af26805-7114-468f-90a0-20e603b0d9a3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:19.999427 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.038523 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs\") pod \"112c7019-188f-47ab-bf5f-a15fddf701eb\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.038729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle\") pod \"112c7019-188f-47ab-bf5f-a15fddf701eb\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.038776 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom\") pod \"112c7019-188f-47ab-bf5f-a15fddf701eb\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.038856 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data\") pod \"112c7019-188f-47ab-bf5f-a15fddf701eb\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.038935 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbg4z\" (UniqueName: \"kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z\") pod \"112c7019-188f-47ab-bf5f-a15fddf701eb\" (UID: \"112c7019-188f-47ab-bf5f-a15fddf701eb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.039009 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs" (OuterVolumeSpecName: "logs") pod "112c7019-188f-47ab-bf5f-a15fddf701eb" (UID: "112c7019-188f-47ab-bf5f-a15fddf701eb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.039435 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/112c7019-188f-47ab-bf5f-a15fddf701eb-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.053137 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z" (OuterVolumeSpecName: "kube-api-access-rbg4z") pod "112c7019-188f-47ab-bf5f-a15fddf701eb" (UID: "112c7019-188f-47ab-bf5f-a15fddf701eb"). InnerVolumeSpecName "kube-api-access-rbg4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.053689 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "112c7019-188f-47ab-bf5f-a15fddf701eb" (UID: "112c7019-188f-47ab-bf5f-a15fddf701eb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.071286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764d46d444-sxpld" event={"ID":"112c7019-188f-47ab-bf5f-a15fddf701eb","Type":"ContainerDied","Data":"ead594214a97be9a0b22dd5beba058969ceb88a8e4bce54f040b309051cda3d7"} Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.071341 4810 scope.go:117] "RemoveContainer" containerID="756b6d336d0d23d3125897ab180755e567052e3baa802bfff2e486d6e8caf1be" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.071471 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-764d46d444-sxpld" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.083016 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "112c7019-188f-47ab-bf5f-a15fddf701eb" (UID: "112c7019-188f-47ab-bf5f-a15fddf701eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.083552 4810 generic.go:334] "Generic (PLEG): container finished" podID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerID="0c95f1834c0b986a889fd50e7f4669a436ddbc7e4054263b236d8a8a96273957" exitCode=0 Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.083622 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerDied","Data":"0c95f1834c0b986a889fd50e7f4669a436ddbc7e4054263b236d8a8a96273957"} Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.084847 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dth2s" event={"ID":"4af26805-7114-468f-90a0-20e603b0d9a3","Type":"ContainerDied","Data":"3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344"} Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.084870 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cf41d950943f6774c3d6fdc1898c9dc64fd1b442a8ef9e474d7ed4f0febe344" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.084933 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dth2s" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.091558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b3a80a81-d436-4fa8-b5b7-560348449df3","Type":"ContainerStarted","Data":"b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf"} Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.123513 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.048083475 podStartE2EDuration="21.12349307s" podCreationTimestamp="2025-10-03 07:16:59 +0000 UTC" firstStartedPulling="2025-10-03 07:17:00.586672431 +0000 UTC m=+1254.013923176" lastFinishedPulling="2025-10-03 07:17:19.662082036 +0000 UTC m=+1273.089332771" observedRunningTime="2025-10-03 07:17:20.113194744 +0000 UTC m=+1273.540445479" watchObservedRunningTime="2025-10-03 07:17:20.12349307 +0000 UTC m=+1273.550743805" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.149210 4810 scope.go:117] "RemoveContainer" containerID="8860e0aa7676e2d97867298d2f6abc0271642fceb46d2405a1131cd31d8eaa7a" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.153704 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbg4z\" (UniqueName: \"kubernetes.io/projected/112c7019-188f-47ab-bf5f-a15fddf701eb-kube-api-access-rbg4z\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.153742 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.153769 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.157788 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.191580 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data" (OuterVolumeSpecName: "config-data") pod "112c7019-188f-47ab-bf5f-a15fddf701eb" (UID: "112c7019-188f-47ab-bf5f-a15fddf701eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.231847 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.233859 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.254960 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2fs5\" (UniqueName: \"kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255026 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255086 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255109 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255132 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255174 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255223 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbc2s\" (UniqueName: \"kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255244 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: 
I1003 07:17:20.255264 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255310 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255350 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d9pl\" (UniqueName: \"kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255384 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml\") pod \"94390f78-7a32-4961-9567-0f2ddbced7cb\" (UID: \"94390f78-7a32-4961-9567-0f2ddbced7cb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255438 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255459 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255484 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data\") pod \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\" (UID: \"94d94cc8-3a85-4e38-b248-4b7809a9d3fb\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255530 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255742 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.255795 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs\") pod \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\" (UID: \"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc\") " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.256261 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/112c7019-188f-47ab-bf5f-a15fddf701eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.256251 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.257533 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs" (OuterVolumeSpecName: "logs") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.263730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs" (OuterVolumeSpecName: "logs") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.264035 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.264065 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.267688 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.269118 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s" (OuterVolumeSpecName: "kube-api-access-jbc2s") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "kube-api-access-jbc2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.270603 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts" (OuterVolumeSpecName: "scripts") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.271447 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.271545 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5" (OuterVolumeSpecName: "kube-api-access-m2fs5") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "kube-api-access-m2fs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.271583 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl" (OuterVolumeSpecName: "kube-api-access-4d9pl") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "kube-api-access-4d9pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.275997 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.279868 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts" (OuterVolumeSpecName: "scripts") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.293400 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts" (OuterVolumeSpecName: "scripts") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.310439 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.333315 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.335772 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.342852 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357611 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357652 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2fs5\" (UniqueName: \"kubernetes.io/projected/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-kube-api-access-m2fs5\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357666 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357677 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357693 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357718 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357731 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/94390f78-7a32-4961-9567-0f2ddbced7cb-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357743 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357754 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357766 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbc2s\" (UniqueName: \"kubernetes.io/projected/94390f78-7a32-4961-9567-0f2ddbced7cb-kube-api-access-jbc2s\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357777 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357788 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357799 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d9pl\" (UniqueName: \"kubernetes.io/projected/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-kube-api-access-4d9pl\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 
07:17:20.357810 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357831 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357844 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357855 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.357865 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.361921 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.378738 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data" (OuterVolumeSpecName: "config-data") pod "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" (UID: "6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.385249 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.388594 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.395820 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.397933 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data" (OuterVolumeSpecName: "config-data") pod "94d94cc8-3a85-4e38-b248-4b7809a9d3fb" (UID: "94d94cc8-3a85-4e38-b248-4b7809a9d3fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.407079 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.414617 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-764d46d444-sxpld"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460793 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460820 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460829 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460841 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460850 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94d94cc8-3a85-4e38-b248-4b7809a9d3fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.460858 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.462063 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data" (OuterVolumeSpecName: "config-data") pod "94390f78-7a32-4961-9567-0f2ddbced7cb" (UID: "94390f78-7a32-4961-9567-0f2ddbced7cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.507002 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-865ck"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.518338 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-t2cwf"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.562346 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94390f78-7a32-4961-9567-0f2ddbced7cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.583797 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9kkxh"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922048 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922502 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922544 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-central-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922553 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-central-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922570 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="proxy-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922577 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="proxy-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922588 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922594 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922607 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-notification-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922616 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-notification-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922629 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922636 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922653 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api-log" Oct 
03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922660 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api-log" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922672 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922678 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922692 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="sg-core" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922699 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="sg-core" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922713 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922720 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: E1003 07:17:20.922741 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" containerName="cinder-db-sync" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922750 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" containerName="cinder-db-sync" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922974 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="sg-core" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.922992 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="proxy-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923007 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923018 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923028 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-notification-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923040 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" containerName="cinder-db-sync" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923050 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" containerName="ceilometer-central-agent" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923061 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923075 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" 
containerName="glance-log" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923091 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" containerName="glance-httpd" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.923104 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" containerName="barbican-api-log" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.924291 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.926429 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.926847 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zm2x4" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.927019 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.927136 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.935123 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.959738 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.965823 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.967202 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.967245 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.981029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.981168 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psczn\" (UniqueName: \"kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.981337 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:20 crc kubenswrapper[4810]: I1003 07:17:20.981509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.015503 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.083575 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.086655 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.086778 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.086827 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.086934 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8fc8\" (UniqueName: \"kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.086941 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psczn\" (UniqueName: \"kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087261 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087378 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087450 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087507 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.087754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.099996 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.100964 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.101181 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.106675 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.114396 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psczn\" (UniqueName: \"kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn\") pod \"cinder-scheduler-0\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.126295 4810 generic.go:334] "Generic (PLEG): container finished" podID="6d076331-4162-4833-a443-d5060885eaa3" containerID="48e3bcb0a0eff2f23d7e18286ed77d99da1068ce70fea65ac4596f9737bd9bb9" exitCode=0 Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.126373 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-865ck" event={"ID":"6d076331-4162-4833-a443-d5060885eaa3","Type":"ContainerDied","Data":"48e3bcb0a0eff2f23d7e18286ed77d99da1068ce70fea65ac4596f9737bd9bb9"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.126399 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-865ck" event={"ID":"6d076331-4162-4833-a443-d5060885eaa3","Type":"ContainerStarted","Data":"70b1606223bd8a5dddfd28143d51a7ddba54d9e22aeaf91dd73e0e4f3769fe4b"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.132190 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.134741 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.138422 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"94390f78-7a32-4961-9567-0f2ddbced7cb","Type":"ContainerDied","Data":"05c6a5551b75ae8c2772473b53f79cea8f0ea77d230dd5a902e1d139267bb3ca"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.142206 4810 scope.go:117] "RemoveContainer" containerID="f31b0e45e6bc358ad5e97e631e4e62b1301370888dd93727bba7a6cef64bc41c" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.138520 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.143045 4810 generic.go:334] "Generic (PLEG): container finished" podID="384a7a09-09bc-4322-b41c-073344b351ef" containerID="a5552a4ebe7248adad1514badc14a89d193f6d57a51f267e35d82e36a87c0a3c" exitCode=0 Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.143072 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kkxh" event={"ID":"384a7a09-09bc-4322-b41c-073344b351ef","Type":"ContainerDied","Data":"a5552a4ebe7248adad1514badc14a89d193f6d57a51f267e35d82e36a87c0a3c"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.146101 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kkxh" event={"ID":"384a7a09-09bc-4322-b41c-073344b351ef","Type":"ContainerStarted","Data":"3416f061b6af9db99bf43c72e6309439a909a437402a4e289cdcaaf635b19b2d"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.148828 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.151010 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc","Type":"ContainerDied","Data":"f91d249919f132249c4bce7d4919cafc80e026839db412676f039cdae9db559f"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.151290 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.160866 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.171315 4810 generic.go:334] "Generic (PLEG): container finished" podID="f135c1f3-8f57-4fba-b0a3-b83ef524acfd" containerID="a949b96df00f2d763ee4f949894529763e305ba03a38a90d148cf3ea09c09c2e" exitCode=0 Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.171527 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2cwf" event={"ID":"f135c1f3-8f57-4fba-b0a3-b83ef524acfd","Type":"ContainerDied","Data":"a949b96df00f2d763ee4f949894529763e305ba03a38a90d148cf3ea09c09c2e"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.171609 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2cwf" event={"ID":"f135c1f3-8f57-4fba-b0a3-b83ef524acfd","Type":"ContainerStarted","Data":"0e1f513fd503db24ced930c6b4850c95c635a727ba90a9cc2d10e24f7dafa42c"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.177601 4810 generic.go:334] "Generic (PLEG): container finished" podID="b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" containerID="9aed72a1383db33d14bca460e68cc2d23b09efa0d6304ea25bf268fbd94fc76f" exitCode=0 Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.177671 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ds94g" event={"ID":"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6","Type":"ContainerDied","Data":"9aed72a1383db33d14bca460e68cc2d23b09efa0d6304ea25bf268fbd94fc76f"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.183438 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"94d94cc8-3a85-4e38-b248-4b7809a9d3fb","Type":"ContainerDied","Data":"5cde710dea441b447f0574eac0c9bed656d5494ca3ecea35c182c2590d8ea855"} Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 
07:17:21.183562 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.187947 4810 scope.go:117] "RemoveContainer" containerID="9a534f1171d5b91c7f851d206340aac873299744de307830b38e5f24547a32c2" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188799 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188820 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188844 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188861 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwl8f\" (UniqueName: \"kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188923 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188940 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188958 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.188979 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189053 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189074 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189091 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8fc8\" (UniqueName: \"kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189679 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.189940 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.191576 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.193876 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.194282 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " 
pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.206529 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8fc8\" (UniqueName: \"kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8\") pod \"dnsmasq-dns-8d5bc587c-q7vgj\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.222095 4810 scope.go:117] "RemoveContainer" containerID="d2644bca4e142921bdd5cc1ab34ada83cf13b6bef815c4417356086919802559" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.239805 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.257024 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.267457 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.268119 4810 scope.go:117] "RemoveContainer" containerID="80359cc6ebe5f16113bcbee03d6448ea22b034de859792d00bde22bffc5d05e6" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.270780 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.281743 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.289651 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.291481 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.293226 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ktz64" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.294163 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.294444 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299612 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299737 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299760 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwl8f\" (UniqueName: \"kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299785 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.299809 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.300003 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.300017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " 
pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.302094 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.302715 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.304133 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.304766 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.307110 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.318330 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.323988 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwl8f\" (UniqueName: \"kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f\") pod \"cinder-api-0\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.326408 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112c7019-188f-47ab-bf5f-a15fddf701eb" path="/var/lib/kubelet/pods/112c7019-188f-47ab-bf5f-a15fddf701eb/volumes" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.327159 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94390f78-7a32-4961-9567-0f2ddbced7cb" path="/var/lib/kubelet/pods/94390f78-7a32-4961-9567-0f2ddbced7cb/volumes" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.329496 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d94cc8-3a85-4e38-b248-4b7809a9d3fb" path="/var/lib/kubelet/pods/94d94cc8-3a85-4e38-b248-4b7809a9d3fb/volumes" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.330302 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.336230 4810 scope.go:117] "RemoveContainer" containerID="0c95f1834c0b986a889fd50e7f4669a436ddbc7e4054263b236d8a8a96273957" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.348038 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:21 crc 
kubenswrapper[4810]: I1003 07:17:21.352853 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.354752 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.355787 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.355976 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.362945 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.364246 4810 scope.go:117] "RemoveContainer" containerID="c742a0eb9bec9574678fd9f89424f9537fd85ffd8b31d33876ce03b40e6c0164" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.372577 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.383401 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.385142 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.393541 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.414373 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417327 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417385 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417415 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417443 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417523 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417544 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417602 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417635 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417654 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64m2h\" (UniqueName: \"kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417706 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417754 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pswk2\" (UniqueName: 
\"kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.417775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.449092 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.465110 4810 scope.go:117] "RemoveContainer" containerID="9ac4fcfc9f7cc315241fd9f62b9064b3b6483a02aae6e95db87cbafb1e876e0c" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.488524 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520375 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520440 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520498 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520572 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64m2h\" (UniqueName: \"kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520924 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520956 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.520971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521039 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pswk2\" (UniqueName: \"kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521068 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521103 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521169 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521341 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jbk\" (UniqueName: \"kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521379 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521404 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521444 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521482 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521626 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521653 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521792 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.521165 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.527433 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.528399 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.533633 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.542678 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.542842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.543072 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.547532 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pswk2\" (UniqueName: \"kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.551937 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64m2h\" (UniqueName: \"kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.560391 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.561275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " 
pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.561640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.561664 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts\") pod \"ceilometer-0\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.562187 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.596250 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.622912 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.622984 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623008 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623041 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623064 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623094 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623148 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jbk\" (UniqueName: \"kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623174 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.623476 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.626097 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.627804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.628696 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.629498 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.646051 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.652906 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.655839 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.662122 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.662989 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.681909 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.691459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jbk\" (UniqueName: \"kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.706708 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.749580 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.807841 4810 scope.go:117] "RemoveContainer" containerID="4fc1312223f6be73f6093c9c399d707a59a6c14f3f19c960d1f5dfc25309cff7" Oct 03 07:17:21 crc kubenswrapper[4810]: I1003 07:17:21.814845 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:21 crc kubenswrapper[4810]: W1003 07:17:21.828220 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90792846_fe89_4337_8965_a6d8b372b80c.slice/crio-359c7be9877d822de364923b37b61e041a31941aa025b30cd07223955a4ddb45 WatchSource:0}: Error finding container 359c7be9877d822de364923b37b61e041a31941aa025b30cd07223955a4ddb45: Status 404 returned error can't find the container with id 359c7be9877d822de364923b37b61e041a31941aa025b30cd07223955a4ddb45 Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.060098 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.150046 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.193869 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" event={"ID":"36444f1e-ead6-4ff4-be56-7fd9c72904f7","Type":"ContainerStarted","Data":"fe4594e313f7cb67ed2530b4d61185a016b5b45f406855837b3755c1b4d7369f"} Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.200038 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerStarted","Data":"d5f5f89407135eaf75f77dbb77b52a74e90940fe654ad827d266cc9a23a6c074"} Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.229211 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerStarted","Data":"359c7be9877d822de364923b37b61e041a31941aa025b30cd07223955a4ddb45"} Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.403106 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.539758 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.626327 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.853734 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.865839 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.990680 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstzx\" (UniqueName: \"kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx\") pod \"f135c1f3-8f57-4fba-b0a3-b83ef524acfd\" (UID: \"f135c1f3-8f57-4fba-b0a3-b83ef524acfd\") " Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.991197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjbtp\" (UniqueName: \"kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp\") pod \"6d076331-4162-4833-a443-d5060885eaa3\" (UID: \"6d076331-4162-4833-a443-d5060885eaa3\") " Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.995292 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp" (OuterVolumeSpecName: "kube-api-access-pjbtp") pod "6d076331-4162-4833-a443-d5060885eaa3" (UID: "6d076331-4162-4833-a443-d5060885eaa3"). InnerVolumeSpecName "kube-api-access-pjbtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:22 crc kubenswrapper[4810]: I1003 07:17:22.995552 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx" (OuterVolumeSpecName: "kube-api-access-bstzx") pod "f135c1f3-8f57-4fba-b0a3-b83ef524acfd" (UID: "f135c1f3-8f57-4fba-b0a3-b83ef524acfd"). InnerVolumeSpecName "kube-api-access-bstzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.006223 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ds94g" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.023180 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.094332 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config\") pod \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.097967 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f468p\" (UniqueName: \"kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p\") pod \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.098127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle\") pod \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\" (UID: \"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6\") " Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.098190 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbj5\" (UniqueName: \"kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5\") pod \"384a7a09-09bc-4322-b41c-073344b351ef\" (UID: \"384a7a09-09bc-4322-b41c-073344b351ef\") " Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.098920 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjbtp\" (UniqueName: \"kubernetes.io/projected/6d076331-4162-4833-a443-d5060885eaa3-kube-api-access-pjbtp\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.098948 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstzx\" (UniqueName: \"kubernetes.io/projected/f135c1f3-8f57-4fba-b0a3-b83ef524acfd-kube-api-access-bstzx\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.101691 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p" (OuterVolumeSpecName: "kube-api-access-f468p") pod "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" (UID: "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6"). InnerVolumeSpecName "kube-api-access-f468p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.106030 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5" (OuterVolumeSpecName: "kube-api-access-7lbj5") pod "384a7a09-09bc-4322-b41c-073344b351ef" (UID: "384a7a09-09bc-4322-b41c-073344b351ef"). InnerVolumeSpecName "kube-api-access-7lbj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.132980 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config" (OuterVolumeSpecName: "config") pod "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" (UID: "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.143248 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" (UID: "b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.200638 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.200666 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f468p\" (UniqueName: \"kubernetes.io/projected/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-kube-api-access-f468p\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.200676 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.200685 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lbj5\" (UniqueName: \"kubernetes.io/projected/384a7a09-09bc-4322-b41c-073344b351ef-kube-api-access-7lbj5\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.258796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerStarted","Data":"d56da9519fd119b5258df6c886b3770dc7f003fbe71251b3bdf7a752ed0c8f5d"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.263280 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerStarted","Data":"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.265665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ds94g" event={"ID":"b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6","Type":"ContainerDied","Data":"76c744ec03724a39826653c7bc3000e5a4e199f1050aedf7740dbf5fc275b9d5"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.265703 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76c744ec03724a39826653c7bc3000e5a4e199f1050aedf7740dbf5fc275b9d5" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.265750 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ds94g" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.282699 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-865ck" event={"ID":"6d076331-4162-4833-a443-d5060885eaa3","Type":"ContainerDied","Data":"70b1606223bd8a5dddfd28143d51a7ddba54d9e22aeaf91dd73e0e4f3769fe4b"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.282773 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70b1606223bd8a5dddfd28143d51a7ddba54d9e22aeaf91dd73e0e4f3769fe4b" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.282861 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-865ck" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.298868 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerStarted","Data":"8107a8ca0dc1964903e6d4447c4b3ba556a89ff5df655c7f3c037c5fa503a779"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.315122 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9kkxh" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.340981 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc" path="/var/lib/kubelet/pods/6569e1c4-c0f7-4146-9ba4-1bb855dfd9fc/volumes" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.342374 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9kkxh" event={"ID":"384a7a09-09bc-4322-b41c-073344b351ef","Type":"ContainerDied","Data":"3416f061b6af9db99bf43c72e6309439a909a437402a4e289cdcaaf635b19b2d"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.342404 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3416f061b6af9db99bf43c72e6309439a909a437402a4e289cdcaaf635b19b2d" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.342416 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerStarted","Data":"81a32e631a4f071d87a3476dbadbd1e71bead6de90d3816abfce65d4a3429596"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.430425 4810 generic.go:334] "Generic (PLEG): container finished" podID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerID="516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9" exitCode=0 Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.430748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" event={"ID":"36444f1e-ead6-4ff4-be56-7fd9c72904f7","Type":"ContainerDied","Data":"516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.477566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-t2cwf" event={"ID":"f135c1f3-8f57-4fba-b0a3-b83ef524acfd","Type":"ContainerDied","Data":"0e1f513fd503db24ced930c6b4850c95c635a727ba90a9cc2d10e24f7dafa42c"} Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.477607 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e1f513fd503db24ced930c6b4850c95c635a727ba90a9cc2d10e24f7dafa42c" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.477704 4810 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-t2cwf" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.620509 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.742380 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:17:23 crc kubenswrapper[4810]: E1003 07:17:23.742817 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f135c1f3-8f57-4fba-b0a3-b83ef524acfd" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.742837 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f135c1f3-8f57-4fba-b0a3-b83ef524acfd" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: E1003 07:17:23.742850 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="384a7a09-09bc-4322-b41c-073344b351ef" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.742856 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="384a7a09-09bc-4322-b41c-073344b351ef" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: E1003 07:17:23.742868 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" containerName="neutron-db-sync" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.742874 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" containerName="neutron-db-sync" Oct 03 07:17:23 crc kubenswrapper[4810]: E1003 07:17:23.742886 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d076331-4162-4833-a443-d5060885eaa3" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.742907 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d076331-4162-4833-a443-d5060885eaa3" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.743090 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" containerName="neutron-db-sync" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.743105 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f135c1f3-8f57-4fba-b0a3-b83ef524acfd" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.743120 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d076331-4162-4833-a443-d5060885eaa3" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.743132 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="384a7a09-09bc-4322-b41c-073344b351ef" containerName="mariadb-database-create" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.744371 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.747905 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.748063 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.748146 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.748377 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-lkp4k" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.751362 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.752903 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.758136 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.770061 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859428 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc2pl\" (UniqueName: \"kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859477 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjpn\" (UniqueName: \"kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859681 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859827 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.859849 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.860072 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.860113 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.860299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.860432 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.961931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.961981 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962036 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962094 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc2pl\" (UniqueName: \"kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962113 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962151 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjpn\" (UniqueName: \"kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962170 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962188 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962215 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.962233 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.963010 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.964006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.964218 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.964866 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.967102 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.968402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.969428 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.970665 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.981682 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.986764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjpn\" (UniqueName: \"kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn\") pod \"neutron-c56f97996-hbltc\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:23 crc kubenswrapper[4810]: I1003 07:17:23.988114 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc2pl\" (UniqueName: \"kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl\") pod \"dnsmasq-dns-5f7ddd66c7-gl95v\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " 
pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.110996 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.123039 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.372094 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.505071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerStarted","Data":"09c620dc8fe68cac670d9ae669f099323e4cea23f3ddbfa4a9ad6d121c674e43"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.510419 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerStarted","Data":"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.526558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerStarted","Data":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.526601 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerStarted","Data":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.532492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerStarted","Data":"61b1ebe508d4c4ad6781c304ef58a02e24a10dd6ba9c40650db17b0f431011ec"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.542673 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="dnsmasq-dns" containerID="cri-o://797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4" gracePeriod=10 Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.544218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" event={"ID":"36444f1e-ead6-4ff4-be56-7fd9c72904f7","Type":"ContainerStarted","Data":"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4"} Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.544485 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 07:17:24.819104 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" podStartSLOduration=4.819087335 podStartE2EDuration="4.819087335s" podCreationTimestamp="2025-10-03 07:17:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:24.578233682 +0000 UTC m=+1278.005484427" watchObservedRunningTime="2025-10-03 07:17:24.819087335 +0000 UTC m=+1278.246338070" Oct 03 07:17:24 crc kubenswrapper[4810]: I1003 
07:17:24.824517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:17:24 crc kubenswrapper[4810]: W1003 07:17:24.863342 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2768fe43_714c_4ffb_b075_c64d36287348.slice/crio-f3c0a5967639e1dbed1de860844021067cd57e22c2202ea84dfb503ef6b40356 WatchSource:0}: Error finding container f3c0a5967639e1dbed1de860844021067cd57e22c2202ea84dfb503ef6b40356: Status 404 returned error can't find the container with id f3c0a5967639e1dbed1de860844021067cd57e22c2202ea84dfb503ef6b40356 Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.026134 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.205940 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314267 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8fc8\" (UniqueName: \"kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314327 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314377 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314421 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314573 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.314689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb\") pod \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\" (UID: \"36444f1e-ead6-4ff4-be56-7fd9c72904f7\") " Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.328130 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8" (OuterVolumeSpecName: "kube-api-access-d8fc8") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "kube-api-access-d8fc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.416923 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8fc8\" (UniqueName: \"kubernetes.io/projected/36444f1e-ead6-4ff4-be56-7fd9c72904f7-kube-api-access-d8fc8\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.442683 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.447380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.481714 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.495851 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.520676 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.520713 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.520724 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.520734 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.535318 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config" (OuterVolumeSpecName: "config") pod "36444f1e-ead6-4ff4-be56-7fd9c72904f7" (UID: "36444f1e-ead6-4ff4-be56-7fd9c72904f7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.587452 4810 generic.go:334] "Generic (PLEG): container finished" podID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerID="797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4" exitCode=0 Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.587539 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" event={"ID":"36444f1e-ead6-4ff4-be56-7fd9c72904f7","Type":"ContainerDied","Data":"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.587567 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" event={"ID":"36444f1e-ead6-4ff4-be56-7fd9c72904f7","Type":"ContainerDied","Data":"fe4594e313f7cb67ed2530b4d61185a016b5b45f406855837b3755c1b4d7369f"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.587583 4810 scope.go:117] "RemoveContainer" containerID="797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.587715 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8d5bc587c-q7vgj" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.599254 4810 generic.go:334] "Generic (PLEG): container finished" podID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerID="ed916efd4e80a6cf41b59abf0b2e8e60cf3f2dd6f2963d367638138bc4d8bce5" exitCode=0 Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.600197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" event={"ID":"9fec4478-2501-4fbe-90f2-05b2f239c265","Type":"ContainerDied","Data":"ed916efd4e80a6cf41b59abf0b2e8e60cf3f2dd6f2963d367638138bc4d8bce5"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.600231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" event={"ID":"9fec4478-2501-4fbe-90f2-05b2f239c265","Type":"ContainerStarted","Data":"9570473a79cb996d338ab1fc31d7e02e09ceaa566a0882ba24761715efeff89a"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.614454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerStarted","Data":"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.614788 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api-log" containerID="cri-o://59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" gracePeriod=30 Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.614976 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.615087 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api" containerID="cri-o://2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" gracePeriod=30 Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.623400 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36444f1e-ead6-4ff4-be56-7fd9c72904f7-config\") on 
node \"crc\" DevicePath \"\"" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.635388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerStarted","Data":"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.649797 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerStarted","Data":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.653044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerStarted","Data":"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.653233 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerStarted","Data":"f3c0a5967639e1dbed1de860844021067cd57e22c2202ea84dfb503ef6b40356"} Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.660220 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.660198203 podStartE2EDuration="4.660198203s" podCreationTimestamp="2025-10-03 07:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:25.64258089 +0000 UTC m=+1279.069831625" watchObservedRunningTime="2025-10-03 07:17:25.660198203 +0000 UTC m=+1279.087448948" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.670141 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.661067621 podStartE2EDuration="5.670123399s" podCreationTimestamp="2025-10-03 07:17:20 +0000 UTC" firstStartedPulling="2025-10-03 07:17:21.84253752 +0000 UTC m=+1275.269788255" lastFinishedPulling="2025-10-03 07:17:22.851593308 +0000 UTC m=+1276.278844033" observedRunningTime="2025-10-03 07:17:25.669948924 +0000 UTC m=+1279.097199659" watchObservedRunningTime="2025-10-03 07:17:25.670123399 +0000 UTC m=+1279.097374134" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.768606 4810 scope.go:117] "RemoveContainer" containerID="516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.795412 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.805667 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8d5bc587c-q7vgj"] Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.816229 4810 scope.go:117] "RemoveContainer" containerID="797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4" Oct 03 07:17:25 crc kubenswrapper[4810]: E1003 07:17:25.816683 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4\": container with ID starting with 797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4 not found: ID does not exist" 
containerID="797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.816725 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4"} err="failed to get container status \"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4\": rpc error: code = NotFound desc = could not find container \"797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4\": container with ID starting with 797f3ddfa01f5c4332d9c1e1b7400dd704be11f5844e0fcd15f4ad5a715d09b4 not found: ID does not exist" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.816751 4810 scope.go:117] "RemoveContainer" containerID="516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9" Oct 03 07:17:25 crc kubenswrapper[4810]: E1003 07:17:25.817084 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9\": container with ID starting with 516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9 not found: ID does not exist" containerID="516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9" Oct 03 07:17:25 crc kubenswrapper[4810]: I1003 07:17:25.817129 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9"} err="failed to get container status \"516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9\": rpc error: code = NotFound desc = could not find container \"516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9\": container with ID starting with 516a7be1fe5e6a4f277cb91669211e6ee96703e947becebc3422ed0b24277fe9 not found: ID does not exist" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.268184 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.402417 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.542959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543041 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543182 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543609 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs" (OuterVolumeSpecName: "logs") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543691 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.543769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.548995 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwl8f\" (UniqueName: \"kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f\") pod \"68285f34-2229-4bb5-b71c-acc1d61f691d\" (UID: \"68285f34-2229-4bb5-b71c-acc1d61f691d\") " Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.552540 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts" (OuterVolumeSpecName: "scripts") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.552653 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.552666 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68285f34-2229-4bb5-b71c-acc1d61f691d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.552675 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68285f34-2229-4bb5-b71c-acc1d61f691d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.553378 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.560086 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f" (OuterVolumeSpecName: "kube-api-access-kwl8f") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "kube-api-access-kwl8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.619375 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data" (OuterVolumeSpecName: "config-data") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.637217 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68285f34-2229-4bb5-b71c-acc1d61f691d" (UID: "68285f34-2229-4bb5-b71c-acc1d61f691d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.657986 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.658078 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.658096 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68285f34-2229-4bb5-b71c-acc1d61f691d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.658113 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwl8f\" (UniqueName: \"kubernetes.io/projected/68285f34-2229-4bb5-b71c-acc1d61f691d-kube-api-access-kwl8f\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.669061 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" event={"ID":"9fec4478-2501-4fbe-90f2-05b2f239c265","Type":"ContainerStarted","Data":"d44fefa290c5c27bc60dfb747c4b1bf38228704c17026548965bb1f674d616ee"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.670452 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.672821 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerStarted","Data":"49fef7a7d668fc557a8e17d62bce3a1fd7aea690b4d9b4a867e5da5faeec8731"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676157 4810 generic.go:334] "Generic (PLEG): container finished" podID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerID="2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" exitCode=0 Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676182 4810 generic.go:334] "Generic (PLEG): container finished" podID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerID="59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" exitCode=143 Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676213 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerDied","Data":"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerDied","Data":"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676240 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68285f34-2229-4bb5-b71c-acc1d61f691d","Type":"ContainerDied","Data":"d5f5f89407135eaf75f77dbb77b52a74e90940fe654ad827d266cc9a23a6c074"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.676254 4810 scope.go:117] "RemoveContainer" containerID="2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" Oct 03 07:17:26 crc 
kubenswrapper[4810]: I1003 07:17:26.676328 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.685834 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerStarted","Data":"ba367086c2d5f7cae40c427ef1243c3b1029c3f76d37b715b51d6a5c5d14c6c7"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.688712 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" podStartSLOduration=3.688693031 podStartE2EDuration="3.688693031s" podCreationTimestamp="2025-10-03 07:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:26.686341938 +0000 UTC m=+1280.113592673" watchObservedRunningTime="2025-10-03 07:17:26.688693031 +0000 UTC m=+1280.115943766" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.692744 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerStarted","Data":"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b"} Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.693151 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.734436 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.734415976 podStartE2EDuration="5.734415976s" podCreationTimestamp="2025-10-03 07:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:26.708057039 +0000 UTC m=+1280.135307764" watchObservedRunningTime="2025-10-03 07:17:26.734415976 +0000 UTC m=+1280.161666711" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.739035 4810 scope.go:117] "RemoveContainer" containerID="59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.747067 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.747043834 podStartE2EDuration="5.747043834s" podCreationTimestamp="2025-10-03 07:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:26.737977272 +0000 UTC m=+1280.165228007" watchObservedRunningTime="2025-10-03 07:17:26.747043834 +0000 UTC m=+1280.174294589" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.767474 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.768311 4810 scope.go:117] "RemoveContainer" containerID="2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.769107 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2\": container with ID starting with 2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2 not found: ID does not 
exist" containerID="2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.769132 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2"} err="failed to get container status \"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2\": rpc error: code = NotFound desc = could not find container \"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2\": container with ID starting with 2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2 not found: ID does not exist" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.769153 4810 scope.go:117] "RemoveContainer" containerID="59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.770371 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e\": container with ID starting with 59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e not found: ID does not exist" containerID="59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.770480 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e"} err="failed to get container status \"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e\": rpc error: code = NotFound desc = could not find container \"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e\": container with ID starting with 59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e not found: ID does not exist" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.770552 4810 scope.go:117] "RemoveContainer" containerID="2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.773347 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2"} err="failed to get container status \"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2\": rpc error: code = NotFound desc = could not find container \"2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2\": container with ID starting with 2f646e80a085a65cb9852e4c7459578331876539f32b84cdfcad7516293be5c2 not found: ID does not exist" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.773531 4810 scope.go:117] "RemoveContainer" containerID="59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.775854 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e"} err="failed to get container status \"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e\": rpc error: code = NotFound desc = could not find container \"59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e\": container with ID starting with 59edc4b42e0d925fe82c58cb1e5ed9b63a92483f44f7ea019c857bdc71704d4e not found: ID does not exist" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.776859 4810 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.784869 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.785266 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="init" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785286 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="init" Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.785319 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api-log" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785328 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api-log" Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.785340 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785348 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api" Oct 03 07:17:26 crc kubenswrapper[4810]: E1003 07:17:26.785364 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="dnsmasq-dns" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785371 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="dnsmasq-dns" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785574 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" containerName="dnsmasq-dns" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785589 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.785611 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" containerName="cinder-api-log" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.786558 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.790986 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-c56f97996-hbltc" podStartSLOduration=3.790963951 podStartE2EDuration="3.790963951s" podCreationTimestamp="2025-10-03 07:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:26.785995008 +0000 UTC m=+1280.213245763" watchObservedRunningTime="2025-10-03 07:17:26.790963951 +0000 UTC m=+1280.218214686" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.792235 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.792507 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.792672 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.817352 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864028 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864069 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6pr\" (UniqueName: \"kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864176 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864191 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864213 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc 
kubenswrapper[4810]: I1003 07:17:26.864247 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864269 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.864300 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966537 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966573 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966619 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966651 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966682 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966724 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " 
pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966750 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6pr\" (UniqueName: \"kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.966848 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.967243 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.967572 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.972252 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.975143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.975143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.975333 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.975472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 07:17:26.975555 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:26 crc kubenswrapper[4810]: I1003 
07:17:26.985243 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6pr\" (UniqueName: \"kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr\") pod \"cinder-api-0\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " pod="openstack/cinder-api-0" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.106319 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.321337 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36444f1e-ead6-4ff4-be56-7fd9c72904f7" path="/var/lib/kubelet/pods/36444f1e-ead6-4ff4-be56-7fd9c72904f7/volumes" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.323166 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68285f34-2229-4bb5-b71c-acc1d61f691d" path="/var/lib/kubelet/pods/68285f34-2229-4bb5-b71c-acc1d61f691d/volumes" Oct 03 07:17:27 crc kubenswrapper[4810]: W1003 07:17:27.626022 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68aa80a4_2dc5_4bf9_a950_6a612c2e182b.slice/crio-06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c WatchSource:0}: Error finding container 06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c: Status 404 returned error can't find the container with id 06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.637153 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.748639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerStarted","Data":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.750033 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.759966 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerStarted","Data":"06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c"} Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.776371 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.857811508 podStartE2EDuration="6.776338533s" podCreationTimestamp="2025-10-03 07:17:21 +0000 UTC" firstStartedPulling="2025-10-03 07:17:22.448860006 +0000 UTC m=+1275.876110741" lastFinishedPulling="2025-10-03 07:17:26.367387031 +0000 UTC m=+1279.794637766" observedRunningTime="2025-10-03 07:17:27.776009515 +0000 UTC m=+1281.203260250" watchObservedRunningTime="2025-10-03 07:17:27.776338533 +0000 UTC m=+1281.203589268" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.902688 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.905422 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.908231 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.908615 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.926997 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.998721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.998811 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.998927 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.998992 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.999029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.999062 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pltk\" (UniqueName: \"kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:27 crc kubenswrapper[4810]: I1003 07:17:27.999260 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.100839 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.100916 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.100956 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.100982 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.101002 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pltk\" (UniqueName: \"kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.102263 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.102419 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.106798 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.116476 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.116489 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: 
\"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.116634 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.118879 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.120688 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.120714 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pltk\" (UniqueName: \"kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk\") pod \"neutron-584cb866f5-5rvp9\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.230483 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.772111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerStarted","Data":"bb77897c0ed3781276a2a824ff0d9bfd38c627747b91d61f75b3403cb0b2103a"} Oct 03 07:17:28 crc kubenswrapper[4810]: I1003 07:17:28.838176 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.839571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerStarted","Data":"61317c73aecc47dae6a6843a4447305cfab5916eb86c694594f67fbaea6a141f"} Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.840208 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.867951 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerStarted","Data":"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a"} Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.867990 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerStarted","Data":"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81"} Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.868000 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" 
event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerStarted","Data":"7ed755887fa36cdbd6d8edaec5e0f363ffb843387c1fb832fa183893eb1ffdcb"} Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.868915 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:29 crc kubenswrapper[4810]: I1003 07:17:29.897112 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.897093478 podStartE2EDuration="3.897093478s" podCreationTimestamp="2025-10-03 07:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:29.891333514 +0000 UTC m=+1283.318584249" watchObservedRunningTime="2025-10-03 07:17:29.897093478 +0000 UTC m=+1283.324344213" Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.342841 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-584cb866f5-5rvp9" podStartSLOduration=3.342823411 podStartE2EDuration="3.342823411s" podCreationTimestamp="2025-10-03 07:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:29.91582293 +0000 UTC m=+1283.343073655" watchObservedRunningTime="2025-10-03 07:17:30.342823411 +0000 UTC m=+1283.770074146" Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.343655 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.874434 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-central-agent" containerID="cri-o://261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" gracePeriod=30 Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.874860 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="sg-core" containerID="cri-o://1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" gracePeriod=30 Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.874946 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="proxy-httpd" containerID="cri-o://c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" gracePeriod=30 Oct 03 07:17:30 crc kubenswrapper[4810]: I1003 07:17:30.875002 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-notification-agent" containerID="cri-o://cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" gracePeriod=30 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.606907 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.646458 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.646497 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc 
kubenswrapper[4810]: I1003 07:17:31.696307 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.696514 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.701355 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.750176 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.750215 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.783696 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.795123 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.803798 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.880915 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pswk2\" (UniqueName: \"kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.880995 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.881083 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.881207 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.881248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.881322 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 
07:17:31.881383 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle\") pod \"18acded5-a21c-4826-902c-d66869ff0cb1\" (UID: \"18acded5-a21c-4826-902c-d66869ff0cb1\") " Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.882680 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.882694 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887256 4810 generic.go:334] "Generic (PLEG): container finished" podID="18acded5-a21c-4826-902c-d66869ff0cb1" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" exitCode=0 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887284 4810 generic.go:334] "Generic (PLEG): container finished" podID="18acded5-a21c-4826-902c-d66869ff0cb1" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" exitCode=2 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887291 4810 generic.go:334] "Generic (PLEG): container finished" podID="18acded5-a21c-4826-902c-d66869ff0cb1" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" exitCode=0 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887300 4810 generic.go:334] "Generic (PLEG): container finished" podID="18acded5-a21c-4826-902c-d66869ff0cb1" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" exitCode=0 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerDied","Data":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887427 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerDied","Data":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerDied","Data":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerDied","Data":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"18acded5-a21c-4826-902c-d66869ff0cb1","Type":"ContainerDied","Data":"8107a8ca0dc1964903e6d4447c4b3ba556a89ff5df655c7f3c037c5fa503a779"} Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887482 4810 scope.go:117] "RemoveContainer" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.887616 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.889554 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="cinder-scheduler" containerID="cri-o://874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719" gracePeriod=30 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.889867 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.889930 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.889945 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.889963 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.894489 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2" (OuterVolumeSpecName: "kube-api-access-pswk2") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "kube-api-access-pswk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.900458 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="probe" containerID="cri-o://81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a" gracePeriod=30 Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.900851 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts" (OuterVolumeSpecName: "scripts") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.922010 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.923156 4810 scope.go:117] "RemoveContainer" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.946927 4810 scope.go:117] "RemoveContainer" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.983995 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.984021 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18acded5-a21c-4826-902c-d66869ff0cb1-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.984030 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.984040 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pswk2\" (UniqueName: \"kubernetes.io/projected/18acded5-a21c-4826-902c-d66869ff0cb1-kube-api-access-pswk2\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.984049 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:31 crc kubenswrapper[4810]: I1003 07:17:31.989646 4810 scope.go:117] "RemoveContainer" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.008434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.047066 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data" (OuterVolumeSpecName: "config-data") pod "18acded5-a21c-4826-902c-d66869ff0cb1" (UID: "18acded5-a21c-4826-902c-d66869ff0cb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.087112 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.087138 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18acded5-a21c-4826-902c-d66869ff0cb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.089107 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.089148 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.110242 4810 scope.go:117] "RemoveContainer" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.110636 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": container with ID starting with c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980 not found: ID does not exist" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.110679 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} err="failed to get container status \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": rpc error: code = NotFound desc = could not find container \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": container with ID starting with c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.110707 4810 scope.go:117] "RemoveContainer" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.111027 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": container with ID starting with 1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1 not found: ID does not exist" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111053 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} err="failed to get container status \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": rpc 
error: code = NotFound desc = could not find container \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": container with ID starting with 1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111069 4810 scope.go:117] "RemoveContainer" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.111282 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": container with ID starting with cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064 not found: ID does not exist" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111310 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} err="failed to get container status \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": rpc error: code = NotFound desc = could not find container \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": container with ID starting with cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111323 4810 scope.go:117] "RemoveContainer" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.111538 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": container with ID starting with 261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee not found: ID does not exist" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111562 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} err="failed to get container status \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": rpc error: code = NotFound desc = could not find container \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": container with ID starting with 261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111577 4810 scope.go:117] "RemoveContainer" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.111794 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} err="failed to get container status \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": rpc error: code = NotFound desc = could not find container \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": container with ID starting with c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 
07:17:32.111818 4810 scope.go:117] "RemoveContainer" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112095 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} err="failed to get container status \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": rpc error: code = NotFound desc = could not find container \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": container with ID starting with 1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112120 4810 scope.go:117] "RemoveContainer" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112337 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} err="failed to get container status \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": rpc error: code = NotFound desc = could not find container \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": container with ID starting with cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112363 4810 scope.go:117] "RemoveContainer" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112593 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} err="failed to get container status \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": rpc error: code = NotFound desc = could not find container \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": container with ID starting with 261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112613 4810 scope.go:117] "RemoveContainer" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112817 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} err="failed to get container status \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": rpc error: code = NotFound desc = could not find container \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": container with ID starting with c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.112839 4810 scope.go:117] "RemoveContainer" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113062 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} err="failed to get container status 
\"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": rpc error: code = NotFound desc = could not find container \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": container with ID starting with 1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113084 4810 scope.go:117] "RemoveContainer" containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113279 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} err="failed to get container status \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": rpc error: code = NotFound desc = could not find container \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": container with ID starting with cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113309 4810 scope.go:117] "RemoveContainer" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113528 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} err="failed to get container status \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": rpc error: code = NotFound desc = could not find container \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": container with ID starting with 261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113548 4810 scope.go:117] "RemoveContainer" containerID="c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113758 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980"} err="failed to get container status \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": rpc error: code = NotFound desc = could not find container \"c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980\": container with ID starting with c65e6740ff116e01e7f981ce8455ba515fd0c92634049af88ba32208c5ef8980 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113779 4810 scope.go:117] "RemoveContainer" containerID="1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.113988 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1"} err="failed to get container status \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": rpc error: code = NotFound desc = could not find container \"1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1\": container with ID starting with 1ddac6a0c27496de743e46644ffc635ade0d1c8c0125267f2e66227afeb478d1 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.114007 4810 scope.go:117] "RemoveContainer" 
containerID="cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.114205 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064"} err="failed to get container status \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": rpc error: code = NotFound desc = could not find container \"cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064\": container with ID starting with cdfb8a8f792aa8bbd87a47db494499c1e5f594273724e0e82f61dddf80c7f064 not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.114237 4810 scope.go:117] "RemoveContainer" containerID="261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.114476 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee"} err="failed to get container status \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": rpc error: code = NotFound desc = could not find container \"261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee\": container with ID starting with 261f063eb67f2a8d2b51bb09e36aad48b194a7de00cb5147433318707eea87ee not found: ID does not exist" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.240974 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.249754 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.256697 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.257099 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-central-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257116 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-central-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.257129 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="sg-core" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257137 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="sg-core" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.257148 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-notification-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257179 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-notification-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: E1003 07:17:32.257208 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="proxy-httpd" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257215 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="proxy-httpd" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 
07:17:32.257401 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-notification-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257418 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="proxy-httpd" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257428 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="sg-core" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.257442 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" containerName="ceilometer-central-agent" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.258992 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.261412 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.261625 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.271249 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392066 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwm5l\" (UniqueName: \"kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392250 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392286 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392314 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.392342 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494040 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwm5l\" (UniqueName: \"kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494336 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494399 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494445 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.494847 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.495165 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.498389 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.498395 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.502770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.508665 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.512772 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwm5l\" (UniqueName: \"kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l\") pod \"ceilometer-0\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.582777 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.897428 4810 generic.go:334] "Generic (PLEG): container finished" podID="90792846-fe89-4337-8965-a6d8b372b80c" containerID="81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a" exitCode=0 Oct 03 07:17:32 crc kubenswrapper[4810]: I1003 07:17:32.897513 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerDied","Data":"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a"} Oct 03 07:17:33 crc kubenswrapper[4810]: W1003 07:17:33.124959 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaf3b5d8_eee9_499b_bc32_d184170a8162.slice/crio-9ce2134602c05df897f61a8c9bcb3287397c598988a482237255c3b97a5866e8 WatchSource:0}: Error finding container 9ce2134602c05df897f61a8c9bcb3287397c598988a482237255c3b97a5866e8: Status 404 returned error can't find the container with id 9ce2134602c05df897f61a8c9bcb3287397c598988a482237255c3b97a5866e8 Oct 03 07:17:33 crc kubenswrapper[4810]: I1003 07:17:33.129715 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:33 crc kubenswrapper[4810]: I1003 07:17:33.314918 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18acded5-a21c-4826-902c-d66869ff0cb1" path="/var/lib/kubelet/pods/18acded5-a21c-4826-902c-d66869ff0cb1/volumes" Oct 03 07:17:33 crc kubenswrapper[4810]: I1003 07:17:33.913209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerStarted","Data":"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0"} Oct 03 07:17:33 crc kubenswrapper[4810]: I1003 07:17:33.913572 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerStarted","Data":"9ce2134602c05df897f61a8c9bcb3287397c598988a482237255c3b97a5866e8"} Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.005370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.005471 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.030224 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.030285 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.030382 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.039438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.125036 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.156234 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 
07:17:34.217699 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.218078 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="dnsmasq-dns" containerID="cri-o://0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2" gracePeriod=10 Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.797091 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.933168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerStarted","Data":"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0"} Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.935704 4810 generic.go:334] "Generic (PLEG): container finished" podID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerID="0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2" exitCode=0 Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.935766 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.935775 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" event={"ID":"d446e6af-e752-4c4d-868a-414c0bfd0104","Type":"ContainerDied","Data":"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2"} Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.935838 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b6bb6597-jtfhd" event={"ID":"d446e6af-e752-4c4d-868a-414c0bfd0104","Type":"ContainerDied","Data":"0d1eb6ebc941d2b6bfc30b0695b4b7d194e0a60e4a18b2b22b4c9a2129ce6d49"} Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.935858 4810 scope.go:117] "RemoveContainer" containerID="0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4qrn\" (UniqueName: \"kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn\") pod \"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947337 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config\") pod \"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947368 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb\") pod \"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947501 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0\") pod 
\"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947629 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb\") pod \"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.947651 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc\") pod \"d446e6af-e752-4c4d-868a-414c0bfd0104\" (UID: \"d446e6af-e752-4c4d-868a-414c0bfd0104\") " Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.952706 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn" (OuterVolumeSpecName: "kube-api-access-d4qrn") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "kube-api-access-d4qrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:34 crc kubenswrapper[4810]: I1003 07:17:34.960829 4810 scope.go:117] "RemoveContainer" containerID="92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.000522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.006370 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.006550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.013205 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.014283 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config" (OuterVolumeSpecName: "config") pod "d446e6af-e752-4c4d-868a-414c0bfd0104" (UID: "d446e6af-e752-4c4d-868a-414c0bfd0104"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.015040 4810 scope.go:117] "RemoveContainer" containerID="0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2" Oct 03 07:17:35 crc kubenswrapper[4810]: E1003 07:17:35.015598 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2\": container with ID starting with 0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2 not found: ID does not exist" containerID="0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.015697 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2"} err="failed to get container status \"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2\": rpc error: code = NotFound desc = could not find container \"0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2\": container with ID starting with 0a4876da77aa738f6fdd145f653032ff4686cfd321c58d92d9b96307b6e1f0e2 not found: ID does not exist" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.015776 4810 scope.go:117] "RemoveContainer" containerID="92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f" Oct 03 07:17:35 crc kubenswrapper[4810]: E1003 07:17:35.018569 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f\": container with ID starting with 92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f not found: ID does not exist" containerID="92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.018609 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f"} err="failed to get container status \"92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f\": rpc error: code = NotFound desc = could not find container \"92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f\": container with ID starting with 92ee7497de3facd0b64bd52267303b5d4684c6f9934179ddb785d60b4b9e723f not found: ID does not exist" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052135 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052163 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052174 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4qrn\" (UniqueName: \"kubernetes.io/projected/d446e6af-e752-4c4d-868a-414c0bfd0104-kube-api-access-d4qrn\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052185 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-config\") 
on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052194 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.052202 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d446e6af-e752-4c4d-868a-414c0bfd0104-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.268661 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.274678 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78b6bb6597-jtfhd"] Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.312928 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" path="/var/lib/kubelet/pods/d446e6af-e752-4c4d-868a-414c0bfd0104/volumes" Oct 03 07:17:35 crc kubenswrapper[4810]: I1003 07:17:35.944815 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerStarted","Data":"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5"} Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.446532 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.582399 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.582671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psczn\" (UniqueName: \"kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.582711 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.582828 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.582967 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.583011 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id\") pod \"90792846-fe89-4337-8965-a6d8b372b80c\" (UID: \"90792846-fe89-4337-8965-a6d8b372b80c\") " Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.583112 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.583342 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90792846-fe89-4337-8965-a6d8b372b80c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.588155 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn" (OuterVolumeSpecName: "kube-api-access-psczn") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "kube-api-access-psczn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.588488 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts" (OuterVolumeSpecName: "scripts") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.588546 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.652086 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.685104 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.685132 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.685143 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.685151 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psczn\" (UniqueName: \"kubernetes.io/projected/90792846-fe89-4337-8965-a6d8b372b80c-kube-api-access-psczn\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.715001 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data" (OuterVolumeSpecName: "config-data") pod "90792846-fe89-4337-8965-a6d8b372b80c" (UID: "90792846-fe89-4337-8965-a6d8b372b80c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.787111 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90792846-fe89-4337-8965-a6d8b372b80c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957222 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerStarted","Data":"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773"} Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957346 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-central-agent" containerID="cri-o://c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0" gracePeriod=30 Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957375 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="proxy-httpd" containerID="cri-o://f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773" gracePeriod=30 Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957405 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="sg-core" containerID="cri-o://3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5" gracePeriod=30 Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957365 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.957402 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" 
containerName="ceilometer-notification-agent" containerID="cri-o://074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0" gracePeriod=30 Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.960054 4810 generic.go:334] "Generic (PLEG): container finished" podID="90792846-fe89-4337-8965-a6d8b372b80c" containerID="874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719" exitCode=0 Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.960098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerDied","Data":"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719"} Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.960133 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90792846-fe89-4337-8965-a6d8b372b80c","Type":"ContainerDied","Data":"359c7be9877d822de364923b37b61e041a31941aa025b30cd07223955a4ddb45"} Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.960157 4810 scope.go:117] "RemoveContainer" containerID="81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.960277 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.991199 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.4770099700000001 podStartE2EDuration="4.991175759s" podCreationTimestamp="2025-10-03 07:17:32 +0000 UTC" firstStartedPulling="2025-10-03 07:17:33.127968577 +0000 UTC m=+1286.555219312" lastFinishedPulling="2025-10-03 07:17:36.642134366 +0000 UTC m=+1290.069385101" observedRunningTime="2025-10-03 07:17:36.984268374 +0000 UTC m=+1290.411519129" watchObservedRunningTime="2025-10-03 07:17:36.991175759 +0000 UTC m=+1290.418426504" Oct 03 07:17:36 crc kubenswrapper[4810]: I1003 07:17:36.998059 4810 scope.go:117] "RemoveContainer" containerID="874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.024883 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.030190 4810 scope.go:117] "RemoveContainer" containerID="81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a" Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.034070 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a\": container with ID starting with 81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a not found: ID does not exist" containerID="81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.034130 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a"} err="failed to get container status \"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a\": rpc error: code = NotFound desc = could not find container \"81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a\": container with ID starting with 81aa76945cf392bd084ccba1e414ef28ccb4b61ae1410106927846cd579fe80a not found: ID does not exist" 
Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.034163 4810 scope.go:117] "RemoveContainer" containerID="874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719" Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.034553 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719\": container with ID starting with 874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719 not found: ID does not exist" containerID="874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.034609 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719"} err="failed to get container status \"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719\": rpc error: code = NotFound desc = could not find container \"874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719\": container with ID starting with 874d9d2335aa6fc9ab57c50e1161c48379fee1958737c43c603a09c81e952719 not found: ID does not exist" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.035969 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.059969 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.060468 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="dnsmasq-dns" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060498 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="dnsmasq-dns" Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.060512 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="cinder-scheduler" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="cinder-scheduler" Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.060540 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="init" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060548 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="init" Oct 03 07:17:37 crc kubenswrapper[4810]: E1003 07:17:37.060567 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="probe" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060572 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="probe" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060740 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="cinder-scheduler" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060763 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="90792846-fe89-4337-8965-a6d8b372b80c" containerName="probe" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.060775 4810 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d446e6af-e752-4c4d-868a-414c0bfd0104" containerName="dnsmasq-dns" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.061981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.063941 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.074462 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.193701 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.194183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxbv7\" (UniqueName: \"kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.194296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.194379 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.194483 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.194569 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296378 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296436 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296475 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296568 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.296629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxbv7\" (UniqueName: \"kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.297724 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.302553 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.303017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.303540 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.306590 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.313494 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxbv7\" (UniqueName: \"kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7\") pod \"cinder-scheduler-0\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.316180 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90792846-fe89-4337-8965-a6d8b372b80c" path="/var/lib/kubelet/pods/90792846-fe89-4337-8965-a6d8b372b80c/volumes" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.391796 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.813294 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:17:37 crc kubenswrapper[4810]: W1003 07:17:37.821517 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a72f53c_a634_4457_9cab_3a4cbea90eae.slice/crio-946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b WatchSource:0}: Error finding container 946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b: Status 404 returned error can't find the container with id 946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.916865 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2bc5-account-create-zflhm"] Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.918663 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.921335 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.941127 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2bc5-account-create-zflhm"] Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.972590 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerStarted","Data":"946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b"} Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.979868 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerID="f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773" exitCode=0 Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.979916 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerID="3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5" exitCode=2 Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.979928 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerID="074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0" exitCode=0 Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.979969 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerDied","Data":"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773"} Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.979999 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerDied","Data":"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5"} Oct 03 07:17:37 crc kubenswrapper[4810]: I1003 07:17:37.980009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerDied","Data":"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0"} Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.015032 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpr6\" (UniqueName: \"kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6\") pod \"nova-api-2bc5-account-create-zflhm\" (UID: \"3605c50a-f081-4145-a88c-4c83b7541db1\") " pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.110711 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3325-account-create-b5gmg"] Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.112259 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.117315 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpr6\" (UniqueName: \"kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6\") pod \"nova-api-2bc5-account-create-zflhm\" (UID: \"3605c50a-f081-4145-a88c-4c83b7541db1\") " pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.118225 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.140548 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpr6\" (UniqueName: \"kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6\") pod \"nova-api-2bc5-account-create-zflhm\" (UID: \"3605c50a-f081-4145-a88c-4c83b7541db1\") " pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.142672 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3325-account-create-b5gmg"] Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.219240 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wq4\" (UniqueName: \"kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4\") pod \"nova-cell0-3325-account-create-b5gmg\" (UID: \"35fb793a-b8f4-4aa8-8a87-e3f3055200e6\") " pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.243443 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.320504 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wq4\" (UniqueName: \"kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4\") pod \"nova-cell0-3325-account-create-b5gmg\" (UID: \"35fb793a-b8f4-4aa8-8a87-e3f3055200e6\") " pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.325707 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6e76-account-create-29sdl"] Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.326830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.338105 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.346533 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wq4\" (UniqueName: \"kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4\") pod \"nova-cell0-3325-account-create-b5gmg\" (UID: \"35fb793a-b8f4-4aa8-8a87-e3f3055200e6\") " pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.356955 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6e76-account-create-29sdl"] Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.423332 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mrg\" (UniqueName: \"kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg\") pod \"nova-cell1-6e76-account-create-29sdl\" (UID: \"1f32340d-61ad-431b-912d-60c4087cc192\") " pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.443481 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.527906 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mrg\" (UniqueName: \"kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg\") pod \"nova-cell1-6e76-account-create-29sdl\" (UID: \"1f32340d-61ad-431b-912d-60c4087cc192\") " pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.575709 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mrg\" (UniqueName: \"kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg\") pod \"nova-cell1-6e76-account-create-29sdl\" (UID: \"1f32340d-61ad-431b-912d-60c4087cc192\") " pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.763607 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:38 crc kubenswrapper[4810]: I1003 07:17:38.957874 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2bc5-account-create-zflhm"] Oct 03 07:17:39 crc kubenswrapper[4810]: I1003 07:17:39.022435 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2bc5-account-create-zflhm" event={"ID":"3605c50a-f081-4145-a88c-4c83b7541db1","Type":"ContainerStarted","Data":"9fba2eee886157d00af89e2e64d2b81b8e75486fa2c868d8564b33364b3222f6"} Oct 03 07:17:39 crc kubenswrapper[4810]: I1003 07:17:39.034965 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3325-account-create-b5gmg"] Oct 03 07:17:39 crc kubenswrapper[4810]: I1003 07:17:39.062884 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerStarted","Data":"0247a4056b29a103d07032ed7a3c453d738bdaa7868ee5953410287fdcac3220"} Oct 03 07:17:39 crc kubenswrapper[4810]: I1003 07:17:39.295426 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6e76-account-create-29sdl"] Oct 03 07:17:39 crc kubenswrapper[4810]: I1003 07:17:39.414638 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.077066 4810 generic.go:334] "Generic (PLEG): container finished" podID="1f32340d-61ad-431b-912d-60c4087cc192" containerID="65b37b6e4a903e3fb2e71f4f76ac62abb4ac05735afaec0a7de87bdfc314c2c0" exitCode=0 Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.077255 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6e76-account-create-29sdl" event={"ID":"1f32340d-61ad-431b-912d-60c4087cc192","Type":"ContainerDied","Data":"65b37b6e4a903e3fb2e71f4f76ac62abb4ac05735afaec0a7de87bdfc314c2c0"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.077657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6e76-account-create-29sdl" event={"ID":"1f32340d-61ad-431b-912d-60c4087cc192","Type":"ContainerStarted","Data":"4f6b65baf527bfe32a93dea32f73226ffa8fd9f4ef9b864bc0f6f530ab941d19"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.081253 4810 generic.go:334] "Generic (PLEG): container finished" podID="35fb793a-b8f4-4aa8-8a87-e3f3055200e6" containerID="9d7aec1813e16ffaf6566acf654970929ac9f6bc63b472068381f3dc2dfc92e1" exitCode=0 Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.081308 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3325-account-create-b5gmg" event={"ID":"35fb793a-b8f4-4aa8-8a87-e3f3055200e6","Type":"ContainerDied","Data":"9d7aec1813e16ffaf6566acf654970929ac9f6bc63b472068381f3dc2dfc92e1"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.081332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3325-account-create-b5gmg" event={"ID":"35fb793a-b8f4-4aa8-8a87-e3f3055200e6","Type":"ContainerStarted","Data":"42c22ae5d85edfeb8dba56db40a26448f3fe3ab624a9afd9619d56415294a0d3"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.083199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerStarted","Data":"3216b2c3caf454d8a6c7df9a162014f7925ab9d4e0f9fb8999f1009a4df3da6f"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.084560 4810 
generic.go:334] "Generic (PLEG): container finished" podID="3605c50a-f081-4145-a88c-4c83b7541db1" containerID="c247d711c2297e4a2d0cc422533a2100aac1e02fae1217c0313bfef19b61465c" exitCode=0 Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.084585 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2bc5-account-create-zflhm" event={"ID":"3605c50a-f081-4145-a88c-4c83b7541db1","Type":"ContainerDied","Data":"c247d711c2297e4a2d0cc422533a2100aac1e02fae1217c0313bfef19b61465c"} Oct 03 07:17:40 crc kubenswrapper[4810]: I1003 07:17:40.154856 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.154833867 podStartE2EDuration="3.154833867s" podCreationTimestamp="2025-10-03 07:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:17:40.14745657 +0000 UTC m=+1293.574707345" watchObservedRunningTime="2025-10-03 07:17:40.154833867 +0000 UTC m=+1293.582084612" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.660087 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.664673 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.670078 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.793243 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbpr6\" (UniqueName: \"kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6\") pod \"3605c50a-f081-4145-a88c-4c83b7541db1\" (UID: \"3605c50a-f081-4145-a88c-4c83b7541db1\") " Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.793645 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5mrg\" (UniqueName: \"kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg\") pod \"1f32340d-61ad-431b-912d-60c4087cc192\" (UID: \"1f32340d-61ad-431b-912d-60c4087cc192\") " Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.793823 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5wq4\" (UniqueName: \"kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4\") pod \"35fb793a-b8f4-4aa8-8a87-e3f3055200e6\" (UID: \"35fb793a-b8f4-4aa8-8a87-e3f3055200e6\") " Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.800756 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4" (OuterVolumeSpecName: "kube-api-access-m5wq4") pod "35fb793a-b8f4-4aa8-8a87-e3f3055200e6" (UID: "35fb793a-b8f4-4aa8-8a87-e3f3055200e6"). InnerVolumeSpecName "kube-api-access-m5wq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.800847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg" (OuterVolumeSpecName: "kube-api-access-n5mrg") pod "1f32340d-61ad-431b-912d-60c4087cc192" (UID: "1f32340d-61ad-431b-912d-60c4087cc192"). InnerVolumeSpecName "kube-api-access-n5mrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.801136 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6" (OuterVolumeSpecName: "kube-api-access-hbpr6") pod "3605c50a-f081-4145-a88c-4c83b7541db1" (UID: "3605c50a-f081-4145-a88c-4c83b7541db1"). InnerVolumeSpecName "kube-api-access-hbpr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.896079 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5wq4\" (UniqueName: \"kubernetes.io/projected/35fb793a-b8f4-4aa8-8a87-e3f3055200e6-kube-api-access-m5wq4\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.896118 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbpr6\" (UniqueName: \"kubernetes.io/projected/3605c50a-f081-4145-a88c-4c83b7541db1-kube-api-access-hbpr6\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:41 crc kubenswrapper[4810]: I1003 07:17:41.896131 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5mrg\" (UniqueName: \"kubernetes.io/projected/1f32340d-61ad-431b-912d-60c4087cc192-kube-api-access-n5mrg\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.103989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6e76-account-create-29sdl" event={"ID":"1f32340d-61ad-431b-912d-60c4087cc192","Type":"ContainerDied","Data":"4f6b65baf527bfe32a93dea32f73226ffa8fd9f4ef9b864bc0f6f530ab941d19"} Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.104038 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6b65baf527bfe32a93dea32f73226ffa8fd9f4ef9b864bc0f6f530ab941d19" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.104454 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6e76-account-create-29sdl" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.105205 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3325-account-create-b5gmg" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.105224 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3325-account-create-b5gmg" event={"ID":"35fb793a-b8f4-4aa8-8a87-e3f3055200e6","Type":"ContainerDied","Data":"42c22ae5d85edfeb8dba56db40a26448f3fe3ab624a9afd9619d56415294a0d3"} Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.105257 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c22ae5d85edfeb8dba56db40a26448f3fe3ab624a9afd9619d56415294a0d3" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.106397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2bc5-account-create-zflhm" event={"ID":"3605c50a-f081-4145-a88c-4c83b7541db1","Type":"ContainerDied","Data":"9fba2eee886157d00af89e2e64d2b81b8e75486fa2c868d8564b33364b3222f6"} Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.106419 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fba2eee886157d00af89e2e64d2b81b8e75486fa2c868d8564b33364b3222f6" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.106478 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2bc5-account-create-zflhm" Oct 03 07:17:42 crc kubenswrapper[4810]: I1003 07:17:42.392610 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.329465 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t8rjv"] Oct 03 07:17:43 crc kubenswrapper[4810]: E1003 07:17:43.330872 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f32340d-61ad-431b-912d-60c4087cc192" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.330983 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f32340d-61ad-431b-912d-60c4087cc192" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: E1003 07:17:43.331065 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fb793a-b8f4-4aa8-8a87-e3f3055200e6" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.331134 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fb793a-b8f4-4aa8-8a87-e3f3055200e6" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: E1003 07:17:43.331245 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605c50a-f081-4145-a88c-4c83b7541db1" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.331318 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605c50a-f081-4145-a88c-4c83b7541db1" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.331657 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fb793a-b8f4-4aa8-8a87-e3f3055200e6" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.331760 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f32340d-61ad-431b-912d-60c4087cc192" containerName="mariadb-account-create" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.331856 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605c50a-f081-4145-a88c-4c83b7541db1" containerName="mariadb-account-create" 
Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.332671 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.338457 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.338924 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.339704 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-sszpn" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.357042 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t8rjv"] Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.424784 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cltx5\" (UniqueName: \"kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.424842 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.424885 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.424961 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.526484 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cltx5\" (UniqueName: \"kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.526717 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.526801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.528501 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.538574 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.545859 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.548869 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cltx5\" (UniqueName: \"kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.549611 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t8rjv\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.677704 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.678488 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.731547 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.731870 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.731919 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.731954 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.731979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwm5l\" (UniqueName: \"kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.732104 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.732164 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle\") pod \"faf3b5d8-eee9-499b-bc32-d184170a8162\" (UID: \"faf3b5d8-eee9-499b-bc32-d184170a8162\") " Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.732531 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.732807 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.735676 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts" (OuterVolumeSpecName: "scripts") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.736055 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l" (OuterVolumeSpecName: "kube-api-access-cwm5l") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "kube-api-access-cwm5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.768828 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.814349 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.833844 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwm5l\" (UniqueName: \"kubernetes.io/projected/faf3b5d8-eee9-499b-bc32-d184170a8162-kube-api-access-cwm5l\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.834045 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.834057 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.834069 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.834079 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/faf3b5d8-eee9-499b-bc32-d184170a8162-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.834090 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.872159 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data" (OuterVolumeSpecName: "config-data") pod "faf3b5d8-eee9-499b-bc32-d184170a8162" (UID: "faf3b5d8-eee9-499b-bc32-d184170a8162"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:43 crc kubenswrapper[4810]: I1003 07:17:43.949359 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faf3b5d8-eee9-499b-bc32-d184170a8162-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.129450 4810 generic.go:334] "Generic (PLEG): container finished" podID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerID="c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0" exitCode=0 Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.129493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerDied","Data":"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0"} Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.129521 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"faf3b5d8-eee9-499b-bc32-d184170a8162","Type":"ContainerDied","Data":"9ce2134602c05df897f61a8c9bcb3287397c598988a482237255c3b97a5866e8"} Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.129549 4810 scope.go:117] "RemoveContainer" containerID="f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.129745 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.138485 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t8rjv"] Oct 03 07:17:44 crc kubenswrapper[4810]: W1003 07:17:44.151993 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34781cb2_747c_4d91_b346_725bb9b63394.slice/crio-49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183 WatchSource:0}: Error finding container 49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183: Status 404 returned error can't find the container with id 49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183 Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.156928 4810 scope.go:117] "RemoveContainer" containerID="3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.177248 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.181991 4810 scope.go:117] "RemoveContainer" containerID="074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.196686 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205224 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.205590 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="proxy-httpd" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205609 4810 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="proxy-httpd" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.205633 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="sg-core" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205641 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="sg-core" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.205670 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-notification-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205679 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-notification-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.205697 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-central-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205705 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-central-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205965 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-notification-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.205987 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="proxy-httpd" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.206010 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="ceilometer-central-agent" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.206025 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" containerName="sg-core" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.206310 4810 scope.go:117] "RemoveContainer" containerID="c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.207849 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.215436 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.215626 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.218725 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.257561 4810 scope.go:117] "RemoveContainer" containerID="f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.258125 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773\": container with ID starting with f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773 not found: ID does not exist" containerID="f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.258173 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773"} err="failed to get container status \"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773\": rpc error: code = NotFound desc = could not find container \"f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773\": container with ID starting with f936207a55dad4f55e70d4ca1bce52aaa3e662ffbbcc0d65da5a700783d05773 not found: ID does not exist" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.258200 4810 scope.go:117] "RemoveContainer" containerID="3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.258637 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5\": container with ID starting with 3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5 not found: ID does not exist" containerID="3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.258673 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5"} err="failed to get container status \"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5\": rpc error: code = NotFound desc = could not find container \"3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5\": container with ID starting with 3b57339f2c084c06b960c4852a2a33415a20a94a69b797b4fa839bcf9d9baae5 not found: ID does not exist" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.258696 4810 scope.go:117] "RemoveContainer" containerID="074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.259155 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0\": container with ID starting with 074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0 not found: ID 
does not exist" containerID="074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.259272 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0"} err="failed to get container status \"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0\": rpc error: code = NotFound desc = could not find container \"074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0\": container with ID starting with 074cdce86d05d4b2dc5c03677f1e1306b90ad2956a47f4866dab0eb8c58bc4b0 not found: ID does not exist" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.259366 4810 scope.go:117] "RemoveContainer" containerID="c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0" Oct 03 07:17:44 crc kubenswrapper[4810]: E1003 07:17:44.259833 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0\": container with ID starting with c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0 not found: ID does not exist" containerID="c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.259857 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0"} err="failed to get container status \"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0\": rpc error: code = NotFound desc = could not find container \"c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0\": container with ID starting with c1b59c954e7b14e18e1122d8f69b12d2b99be1746bd724a77f66c39aa7d650a0 not found: ID does not exist" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.355920 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhd5t\" (UniqueName: \"kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.355967 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.356001 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.356031 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.356233 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.356251 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.356263 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.457838 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.457882 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.457917 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.457958 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhd5t\" (UniqueName: \"kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.457975 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.458002 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.458028 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.459213 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.459387 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.462606 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.467759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.470417 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.470550 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.476646 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhd5t\" (UniqueName: \"kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t\") pod \"ceilometer-0\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " pod="openstack/ceilometer-0" Oct 03 07:17:44 crc kubenswrapper[4810]: I1003 07:17:44.534495 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:45 crc kubenswrapper[4810]: I1003 07:17:45.002012 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:45 crc kubenswrapper[4810]: I1003 07:17:45.153039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerStarted","Data":"0313fb538443a10963fe17fc56a1de308c2d80c11813030bec9e36bdb4feb1ec"} Oct 03 07:17:45 crc kubenswrapper[4810]: I1003 07:17:45.160802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" event={"ID":"34781cb2-747c-4d91-b346-725bb9b63394","Type":"ContainerStarted","Data":"49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183"} Oct 03 07:17:45 crc kubenswrapper[4810]: I1003 07:17:45.313555 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf3b5d8-eee9-499b-bc32-d184170a8162" path="/var/lib/kubelet/pods/faf3b5d8-eee9-499b-bc32-d184170a8162/volumes" Oct 03 07:17:46 crc kubenswrapper[4810]: I1003 07:17:46.177393 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerStarted","Data":"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1"} Oct 03 07:17:47 crc kubenswrapper[4810]: I1003 07:17:47.186903 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerStarted","Data":"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5"} Oct 03 07:17:47 crc kubenswrapper[4810]: I1003 07:17:47.724164 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 07:17:52 crc kubenswrapper[4810]: I1003 07:17:52.236175 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" event={"ID":"34781cb2-747c-4d91-b346-725bb9b63394","Type":"ContainerStarted","Data":"b1d5a09f50b97a0b1a6043433d3b498884f9e351468f8bfcf55a3f1d35912ab0"} Oct 03 07:17:52 crc kubenswrapper[4810]: I1003 07:17:52.241802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerStarted","Data":"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b"} Oct 03 07:17:52 crc kubenswrapper[4810]: I1003 07:17:52.259973 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" podStartSLOduration=1.879733829 podStartE2EDuration="9.259958318s" podCreationTimestamp="2025-10-03 07:17:43 +0000 UTC" firstStartedPulling="2025-10-03 07:17:44.157116247 +0000 UTC m=+1297.584366982" lastFinishedPulling="2025-10-03 07:17:51.537340736 +0000 UTC m=+1304.964591471" observedRunningTime="2025-10-03 07:17:52.254282046 +0000 UTC m=+1305.681532781" watchObservedRunningTime="2025-10-03 07:17:52.259958318 +0000 UTC m=+1305.687209053" Oct 03 07:17:53 crc kubenswrapper[4810]: I1003 07:17:53.265386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerStarted","Data":"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845"} Oct 03 07:17:53 crc kubenswrapper[4810]: I1003 07:17:53.292119 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=1.705056002 podStartE2EDuration="9.292103203s" podCreationTimestamp="2025-10-03 07:17:44 +0000 UTC" firstStartedPulling="2025-10-03 07:17:45.014992252 +0000 UTC m=+1298.442242987" lastFinishedPulling="2025-10-03 07:17:52.602039453 +0000 UTC m=+1306.029290188" observedRunningTime="2025-10-03 07:17:53.289621337 +0000 UTC m=+1306.716872092" watchObservedRunningTime="2025-10-03 07:17:53.292103203 +0000 UTC m=+1306.719353938" Oct 03 07:17:54 crc kubenswrapper[4810]: I1003 07:17:54.120360 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:17:54 crc kubenswrapper[4810]: I1003 07:17:54.278873 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:17:56 crc kubenswrapper[4810]: I1003 07:17:56.817840 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:56 crc kubenswrapper[4810]: I1003 07:17:56.818496 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-central-agent" containerID="cri-o://f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1" gracePeriod=30 Oct 03 07:17:56 crc kubenswrapper[4810]: I1003 07:17:56.818546 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="sg-core" containerID="cri-o://a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b" gracePeriod=30 Oct 03 07:17:56 crc kubenswrapper[4810]: I1003 07:17:56.818552 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="proxy-httpd" containerID="cri-o://bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845" gracePeriod=30 Oct 03 07:17:56 crc kubenswrapper[4810]: I1003 07:17:56.818631 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-notification-agent" containerID="cri-o://dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5" gracePeriod=30 Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.324686 4810 generic.go:334] "Generic (PLEG): container finished" podID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerID="bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845" exitCode=0 Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.324724 4810 generic.go:334] "Generic (PLEG): container finished" podID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerID="a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b" exitCode=2 Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.324733 4810 generic.go:334] "Generic (PLEG): container finished" podID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerID="f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1" exitCode=0 Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.328960 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerDied","Data":"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845"} Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.329032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerDied","Data":"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b"} Oct 03 07:17:57 crc kubenswrapper[4810]: I1003 07:17:57.329099 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerDied","Data":"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1"} Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.249064 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.333998 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.334258 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c56f97996-hbltc" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-api" containerID="cri-o://a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e" gracePeriod=30 Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.334542 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-c56f97996-hbltc" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-httpd" containerID="cri-o://a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b" gracePeriod=30 Oct 03 07:17:58 crc kubenswrapper[4810]: E1003 07:17:58.623001 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2768fe43_714c_4ffb_b075_c64d36287348.slice/crio-a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2768fe43_714c_4ffb_b075_c64d36287348.slice/crio-conmon-a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.769045 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839250 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839317 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839438 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839461 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhd5t\" (UniqueName: \"kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839612 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.839670 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd\") pod \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\" (UID: \"424f256d-3f6a-44d1-a1cd-2e283bb42b33\") " Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.840397 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.840509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.857438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t" (OuterVolumeSpecName: "kube-api-access-xhd5t") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "kube-api-access-xhd5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.860424 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts" (OuterVolumeSpecName: "scripts") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.872123 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.907847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.929984 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data" (OuterVolumeSpecName: "config-data") pod "424f256d-3f6a-44d1-a1cd-2e283bb42b33" (UID: "424f256d-3f6a-44d1-a1cd-2e283bb42b33"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941210 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941243 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941252 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941260 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941271 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhd5t\" (UniqueName: \"kubernetes.io/projected/424f256d-3f6a-44d1-a1cd-2e283bb42b33-kube-api-access-xhd5t\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941279 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/424f256d-3f6a-44d1-a1cd-2e283bb42b33-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:58 crc kubenswrapper[4810]: I1003 07:17:58.941287 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/424f256d-3f6a-44d1-a1cd-2e283bb42b33-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.388400 4810 generic.go:334] "Generic (PLEG): container finished" podID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerID="dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5" exitCode=0 Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.388578 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerDied","Data":"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5"} Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.388728 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"424f256d-3f6a-44d1-a1cd-2e283bb42b33","Type":"ContainerDied","Data":"0313fb538443a10963fe17fc56a1de308c2d80c11813030bec9e36bdb4feb1ec"} Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.388746 4810 scope.go:117] "RemoveContainer" containerID="bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.388667 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.391667 4810 generic.go:334] "Generic (PLEG): container finished" podID="2768fe43-714c-4ffb-b075-c64d36287348" containerID="a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b" exitCode=0 Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.391701 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerDied","Data":"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b"} Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.420047 4810 scope.go:117] "RemoveContainer" containerID="a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.431592 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.442944 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.446312 4810 scope.go:117] "RemoveContainer" containerID="dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450043 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.450430 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-central-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450446 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-central-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.450458 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="proxy-httpd" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450464 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="proxy-httpd" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.450490 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="sg-core" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450496 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="sg-core" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.450515 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-notification-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450522 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-notification-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450714 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="sg-core" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450737 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-central-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450752 4810 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="proxy-httpd" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.450760 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" containerName="ceilometer-notification-agent" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.452284 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.456944 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.459178 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.459377 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.487746 4810 scope.go:117] "RemoveContainer" containerID="f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.506024 4810 scope.go:117] "RemoveContainer" containerID="bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.506749 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845\": container with ID starting with bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845 not found: ID does not exist" containerID="bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.506798 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845"} err="failed to get container status \"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845\": rpc error: code = NotFound desc = could not find container \"bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845\": container with ID starting with bab2d57231f85001756e2853eb299b54891486ab2af94aa6d314a48661bc2845 not found: ID does not exist" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.506827 4810 scope.go:117] "RemoveContainer" containerID="a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.507320 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b\": container with ID starting with a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b not found: ID does not exist" containerID="a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.507357 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b"} err="failed to get container status \"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b\": rpc error: code = NotFound desc = could not find container \"a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b\": container with ID starting with 
a4f9fe6609e8d4fe92d97710f1170878156f7026ff0589503ca08f2ce44d8a5b not found: ID does not exist" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.507377 4810 scope.go:117] "RemoveContainer" containerID="dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.507632 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5\": container with ID starting with dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5 not found: ID does not exist" containerID="dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.507653 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5"} err="failed to get container status \"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5\": rpc error: code = NotFound desc = could not find container \"dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5\": container with ID starting with dcc9d30da38e181ccf7ac02ac58774acc5dae162a2570c7cf3dca001db1124d5 not found: ID does not exist" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.507686 4810 scope.go:117] "RemoveContainer" containerID="f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1" Oct 03 07:17:59 crc kubenswrapper[4810]: E1003 07:17:59.507967 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1\": container with ID starting with f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1 not found: ID does not exist" containerID="f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.508001 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1"} err="failed to get container status \"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1\": rpc error: code = NotFound desc = could not find container \"f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1\": container with ID starting with f614658a745aca161379d6093afbba2f067fd1c2302393cb1ea52a5e3b0b8fd1 not found: ID does not exist" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550705 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhhwk\" (UniqueName: \"kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550755 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550838 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550883 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550926 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550951 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.550994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.652506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653042 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653299 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653307 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bhhwk\" (UniqueName: \"kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653740 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.653865 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.654456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.657407 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.659226 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.660055 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.672122 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.677865 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhhwk\" (UniqueName: \"kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk\") pod \"ceilometer-0\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " pod="openstack/ceilometer-0" Oct 03 07:17:59 crc kubenswrapper[4810]: I1003 07:17:59.768218 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.208805 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:00 crc kubenswrapper[4810]: W1003 07:18:00.210216 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad3cc8fa_7036_4fc5_883b_e2b955e35397.slice/crio-bb897473411aa5fdaee5c123949c73136ad59a4a5b71f3e333212df8bbadae21 WatchSource:0}: Error finding container bb897473411aa5fdaee5c123949c73136ad59a4a5b71f3e333212df8bbadae21: Status 404 returned error can't find the container with id bb897473411aa5fdaee5c123949c73136ad59a4a5b71f3e333212df8bbadae21 Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.334579 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.402699 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerStarted","Data":"bb897473411aa5fdaee5c123949c73136ad59a4a5b71f3e333212df8bbadae21"} Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.409424 4810 generic.go:334] "Generic (PLEG): container finished" podID="2768fe43-714c-4ffb-b075-c64d36287348" containerID="a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e" exitCode=0 Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.409465 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerDied","Data":"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e"} Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.409492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c56f97996-hbltc" event={"ID":"2768fe43-714c-4ffb-b075-c64d36287348","Type":"ContainerDied","Data":"f3c0a5967639e1dbed1de860844021067cd57e22c2202ea84dfb503ef6b40356"} Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.409508 4810 scope.go:117] "RemoveContainer" containerID="a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.409606 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c56f97996-hbltc" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.451530 4810 scope.go:117] "RemoveContainer" containerID="a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.470927 4810 scope.go:117] "RemoveContainer" containerID="a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.470937 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs\") pod \"2768fe43-714c-4ffb-b075-c64d36287348\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471081 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config\") pod \"2768fe43-714c-4ffb-b075-c64d36287348\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471193 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config\") pod \"2768fe43-714c-4ffb-b075-c64d36287348\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjpn\" (UniqueName: \"kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn\") pod \"2768fe43-714c-4ffb-b075-c64d36287348\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle\") pod \"2768fe43-714c-4ffb-b075-c64d36287348\" (UID: \"2768fe43-714c-4ffb-b075-c64d36287348\") " Oct 03 07:18:00 crc kubenswrapper[4810]: E1003 07:18:00.471783 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b\": container with ID starting with a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b not found: ID does not exist" containerID="a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471818 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b"} err="failed to get container status \"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b\": rpc error: code = NotFound desc = could not find container \"a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b\": container with ID starting with a3fd346a8c7decbf672164088dd3b151f659fd9c0f643c592458d4be881efd2b not found: ID does not exist" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.471844 4810 scope.go:117] "RemoveContainer" containerID="a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e" Oct 03 07:18:00 crc kubenswrapper[4810]: E1003 07:18:00.472513 4810 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e\": container with ID starting with a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e not found: ID does not exist" containerID="a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.472563 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e"} err="failed to get container status \"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e\": rpc error: code = NotFound desc = could not find container \"a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e\": container with ID starting with a45b5c88a0de6fed29ef31c2621cab9fd8d82a3c0cf62a4ba6b2e80944c4980e not found: ID does not exist" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.482706 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn" (OuterVolumeSpecName: "kube-api-access-ggjpn") pod "2768fe43-714c-4ffb-b075-c64d36287348" (UID: "2768fe43-714c-4ffb-b075-c64d36287348"). InnerVolumeSpecName "kube-api-access-ggjpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.482795 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2768fe43-714c-4ffb-b075-c64d36287348" (UID: "2768fe43-714c-4ffb-b075-c64d36287348"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.520006 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2768fe43-714c-4ffb-b075-c64d36287348" (UID: "2768fe43-714c-4ffb-b075-c64d36287348"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.530738 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config" (OuterVolumeSpecName: "config") pod "2768fe43-714c-4ffb-b075-c64d36287348" (UID: "2768fe43-714c-4ffb-b075-c64d36287348"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.542504 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2768fe43-714c-4ffb-b075-c64d36287348" (UID: "2768fe43-714c-4ffb-b075-c64d36287348"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.573474 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.573503 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.573512 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjpn\" (UniqueName: \"kubernetes.io/projected/2768fe43-714c-4ffb-b075-c64d36287348-kube-api-access-ggjpn\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.573520 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.573529 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2768fe43-714c-4ffb-b075-c64d36287348-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.787572 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:18:00 crc kubenswrapper[4810]: I1003 07:18:00.798069 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c56f97996-hbltc"] Oct 03 07:18:01 crc kubenswrapper[4810]: I1003 07:18:01.318211 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2768fe43-714c-4ffb-b075-c64d36287348" path="/var/lib/kubelet/pods/2768fe43-714c-4ffb-b075-c64d36287348/volumes" Oct 03 07:18:01 crc kubenswrapper[4810]: I1003 07:18:01.319203 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="424f256d-3f6a-44d1-a1cd-2e283bb42b33" path="/var/lib/kubelet/pods/424f256d-3f6a-44d1-a1cd-2e283bb42b33/volumes" Oct 03 07:18:01 crc kubenswrapper[4810]: I1003 07:18:01.431522 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerStarted","Data":"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d"} Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.090740 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.091094 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.091135 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.091861 4810 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.091951 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489" gracePeriod=600 Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.445361 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489"} Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.445640 4810 scope.go:117] "RemoveContainer" containerID="36ef9ac14686f7fe3d58791a3ec1d926fc16d91e160ef38a3e94c821c933639a" Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.445292 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489" exitCode=0 Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.445748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66"} Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.447859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerStarted","Data":"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a"} Oct 03 07:18:02 crc kubenswrapper[4810]: I1003 07:18:02.447887 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerStarted","Data":"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0"} Oct 03 07:18:03 crc kubenswrapper[4810]: I1003 07:18:03.465976 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerStarted","Data":"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a"} Oct 03 07:18:03 crc kubenswrapper[4810]: I1003 07:18:03.466533 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:18:03 crc kubenswrapper[4810]: I1003 07:18:03.468588 4810 generic.go:334] "Generic (PLEG): container finished" podID="34781cb2-747c-4d91-b346-725bb9b63394" containerID="b1d5a09f50b97a0b1a6043433d3b498884f9e351468f8bfcf55a3f1d35912ab0" exitCode=0 Oct 03 07:18:03 crc kubenswrapper[4810]: I1003 07:18:03.468661 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" event={"ID":"34781cb2-747c-4d91-b346-725bb9b63394","Type":"ContainerDied","Data":"b1d5a09f50b97a0b1a6043433d3b498884f9e351468f8bfcf55a3f1d35912ab0"} Oct 03 07:18:03 crc kubenswrapper[4810]: I1003 07:18:03.503284 4810 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.50129158 podStartE2EDuration="4.503260036s" podCreationTimestamp="2025-10-03 07:17:59 +0000 UTC" firstStartedPulling="2025-10-03 07:18:00.214081044 +0000 UTC m=+1313.641331779" lastFinishedPulling="2025-10-03 07:18:03.2160495 +0000 UTC m=+1316.643300235" observedRunningTime="2025-10-03 07:18:03.490103734 +0000 UTC m=+1316.917354489" watchObservedRunningTime="2025-10-03 07:18:03.503260036 +0000 UTC m=+1316.930510781" Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.821828 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.952527 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle\") pod \"34781cb2-747c-4d91-b346-725bb9b63394\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.952613 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cltx5\" (UniqueName: \"kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5\") pod \"34781cb2-747c-4d91-b346-725bb9b63394\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.952656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts\") pod \"34781cb2-747c-4d91-b346-725bb9b63394\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.952688 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data\") pod \"34781cb2-747c-4d91-b346-725bb9b63394\" (UID: \"34781cb2-747c-4d91-b346-725bb9b63394\") " Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.958528 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5" (OuterVolumeSpecName: "kube-api-access-cltx5") pod "34781cb2-747c-4d91-b346-725bb9b63394" (UID: "34781cb2-747c-4d91-b346-725bb9b63394"). InnerVolumeSpecName "kube-api-access-cltx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.963679 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts" (OuterVolumeSpecName: "scripts") pod "34781cb2-747c-4d91-b346-725bb9b63394" (UID: "34781cb2-747c-4d91-b346-725bb9b63394"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.985498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34781cb2-747c-4d91-b346-725bb9b63394" (UID: "34781cb2-747c-4d91-b346-725bb9b63394"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:04 crc kubenswrapper[4810]: I1003 07:18:04.997535 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data" (OuterVolumeSpecName: "config-data") pod "34781cb2-747c-4d91-b346-725bb9b63394" (UID: "34781cb2-747c-4d91-b346-725bb9b63394"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.054396 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.054450 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cltx5\" (UniqueName: \"kubernetes.io/projected/34781cb2-747c-4d91-b346-725bb9b63394-kube-api-access-cltx5\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.054464 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.054474 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34781cb2-747c-4d91-b346-725bb9b63394-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.489250 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" event={"ID":"34781cb2-747c-4d91-b346-725bb9b63394","Type":"ContainerDied","Data":"49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183"} Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.489659 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49b243b1236cad4169cfa951c1a2d34d9a2365d4831f04837db3fa02f0d64183" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.489329 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t8rjv" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.630883 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:18:05 crc kubenswrapper[4810]: E1003 07:18:05.631338 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34781cb2-747c-4d91-b346-725bb9b63394" containerName="nova-cell0-conductor-db-sync" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631355 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="34781cb2-747c-4d91-b346-725bb9b63394" containerName="nova-cell0-conductor-db-sync" Oct 03 07:18:05 crc kubenswrapper[4810]: E1003 07:18:05.631376 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-api" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631385 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-api" Oct 03 07:18:05 crc kubenswrapper[4810]: E1003 07:18:05.631416 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-httpd" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631425 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-httpd" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631637 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="34781cb2-747c-4d91-b346-725bb9b63394" containerName="nova-cell0-conductor-db-sync" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631653 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-httpd" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.631669 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2768fe43-714c-4ffb-b075-c64d36287348" containerName="neutron-api" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.632418 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.635102 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-sszpn" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.638563 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.645050 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.777678 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.777834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.777917 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wb7\" (UniqueName: \"kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.880077 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wb7\" (UniqueName: \"kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.880254 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.880303 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.898219 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.900597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.908119 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wb7\" (UniqueName: \"kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7\") pod \"nova-cell0-conductor-0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:05 crc kubenswrapper[4810]: I1003 07:18:05.954688 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:06 crc kubenswrapper[4810]: I1003 07:18:06.461983 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:18:06 crc kubenswrapper[4810]: W1003 07:18:06.470100 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod589491ba_eae0_427c_8124_cacdcabd03c0.slice/crio-c322c03b1f868e344abca0f607ab2296b22ae489d6731a7d4d90393f735d359c WatchSource:0}: Error finding container c322c03b1f868e344abca0f607ab2296b22ae489d6731a7d4d90393f735d359c: Status 404 returned error can't find the container with id c322c03b1f868e344abca0f607ab2296b22ae489d6731a7d4d90393f735d359c Oct 03 07:18:06 crc kubenswrapper[4810]: I1003 07:18:06.500604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"589491ba-eae0-427c-8124-cacdcabd03c0","Type":"ContainerStarted","Data":"c322c03b1f868e344abca0f607ab2296b22ae489d6731a7d4d90393f735d359c"} Oct 03 07:18:07 crc kubenswrapper[4810]: I1003 07:18:07.514346 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"589491ba-eae0-427c-8124-cacdcabd03c0","Type":"ContainerStarted","Data":"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1"} Oct 03 07:18:07 crc kubenswrapper[4810]: I1003 07:18:07.515056 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:07 crc kubenswrapper[4810]: I1003 07:18:07.536587 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.536567786 podStartE2EDuration="2.536567786s" podCreationTimestamp="2025-10-03 07:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:07.531720416 +0000 UTC m=+1320.958971151" watchObservedRunningTime="2025-10-03 07:18:07.536567786 +0000 UTC m=+1320.963818521" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.001775 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.525776 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mn9pm"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.526868 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.529083 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.530486 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.543241 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mn9pm"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.626221 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.626588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76gxv\" (UniqueName: \"kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.626729 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.626764 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.706776 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.708296 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.712299 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.729233 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.729438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.729599 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76gxv\" (UniqueName: \"kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.729678 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.729707 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.736609 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.753755 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.766770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.769269 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76gxv\" (UniqueName: \"kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv\") pod \"nova-cell0-cell-mapping-mn9pm\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.831279 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.831320 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz27c\" (UniqueName: \"kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.831357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.831386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.831617 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.833130 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.837681 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.845924 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.847206 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.847458 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.853567 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.887437 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.888677 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.890169 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.898205 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.912705 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933149 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933216 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5mhs\" (UniqueName: \"kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933241 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933317 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933336 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz27c\" (UniqueName: \"kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933379 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: 
\"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933422 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933441 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933461 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933484 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.933545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5l46\" (UniqueName: \"kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.936111 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.939071 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.948650 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.951722 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pz27c\" (UniqueName: \"kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c\") pod \"nova-metadata-0\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " pod="openstack/nova-metadata-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.977366 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.978825 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.986146 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 07:18:16 crc kubenswrapper[4810]: I1003 07:18:16.993038 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036025 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036087 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036119 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5l46\" (UniqueName: \"kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036161 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036201 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5mhs\" (UniqueName: \"kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036221 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 
03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrqm\" (UniqueName: \"kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036404 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036429 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036450 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.036467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.037307 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.037494 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.038045 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.038122 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.039846 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.041359 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.046593 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.062925 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.073493 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5mhs\" (UniqueName: \"kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs\") pod \"nova-api-0\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.075862 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5l46\" (UniqueName: \"kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46\") pod \"dnsmasq-dns-57b74674bf-z2ckg\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.138784 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrqm\" (UniqueName: \"kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.138861 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4m5x\" (UniqueName: \"kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.138974 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " 
pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.139222 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.139283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.139735 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.142596 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.157557 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.168854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrqm\" (UniqueName: \"kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm\") pod \"nova-scheduler-0\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.194117 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.244427 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4m5x\" (UniqueName: \"kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.244573 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.244665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.251030 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.251347 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.275185 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4m5x\" (UniqueName: \"kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x\") pod \"nova-cell1-novncproxy-0\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.335832 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.349808 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.400522 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.415765 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.438084 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mn9pm"] Oct 03 07:18:17 crc kubenswrapper[4810]: W1003 07:18:17.462098 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0b6f27_8285_42ab_99ee_35c1b6370602.slice/crio-04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93 WatchSource:0}: Error finding container 04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93: Status 404 returned error can't find the container with id 04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93 Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.592658 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dcgw8"] Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.593907 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.601750 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.601919 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.615474 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dcgw8"] Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.646327 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mn9pm" event={"ID":"5a0b6f27-8285-42ab-99ee-35c1b6370602","Type":"ContainerStarted","Data":"04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93"} Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.653831 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.653868 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.653936 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflrc\" (UniqueName: \"kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.654039 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: 
\"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.680730 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:17 crc kubenswrapper[4810]: W1003 07:18:17.687767 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba961e6_d6a2_4a88_914d_4c2a01e783b3.slice/crio-ae2a3440457fc0a2acc09ebe424932ca33ebfe46f5b5652cfe29972cd34c99af WatchSource:0}: Error finding container ae2a3440457fc0a2acc09ebe424932ca33ebfe46f5b5652cfe29972cd34c99af: Status 404 returned error can't find the container with id ae2a3440457fc0a2acc09ebe424932ca33ebfe46f5b5652cfe29972cd34c99af Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.755412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.755450 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.755479 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qflrc\" (UniqueName: \"kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.755581 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.764665 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.787463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflrc\" (UniqueName: \"kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.793065 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 
03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.793698 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dcgw8\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.925574 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:17 crc kubenswrapper[4810]: I1003 07:18:17.969789 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:17 crc kubenswrapper[4810]: W1003 07:18:17.982286 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9171b82_fdec_4a10_a2af_1366e15feafe.slice/crio-ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8 WatchSource:0}: Error finding container ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8: Status 404 returned error can't find the container with id ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8 Oct 03 07:18:18 crc kubenswrapper[4810]: W1003 07:18:18.041880 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39cfc321_3bad_4e79_8e28_5bd06d4f3c71.slice/crio-b2b321fa4b8438a1ab2771db5c515d7a4c4a634ac6710dd749c812cbb1c570e3 WatchSource:0}: Error finding container b2b321fa4b8438a1ab2771db5c515d7a4c4a634ac6710dd749c812cbb1c570e3: Status 404 returned error can't find the container with id b2b321fa4b8438a1ab2771db5c515d7a4c4a634ac6710dd749c812cbb1c570e3 Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.067766 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.095520 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:18 crc kubenswrapper[4810]: W1003 07:18:18.124904 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06504fdc_7e9c_453c_9661_3a921536ada2.slice/crio-5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf WatchSource:0}: Error finding container 5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf: Status 404 returned error can't find the container with id 5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.126114 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:18 crc kubenswrapper[4810]: W1003 07:18:18.430943 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc254e046_e923_4f68_8d96_e098b4ba354e.slice/crio-e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d WatchSource:0}: Error finding container e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d: Status 404 returned error can't find the container with id e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.445184 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dcgw8"] Oct 03 07:18:18 
crc kubenswrapper[4810]: I1003 07:18:18.656792 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06504fdc-7e9c-453c-9661-3a921536ada2","Type":"ContainerStarted","Data":"5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.658490 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerStarted","Data":"ae2a3440457fc0a2acc09ebe424932ca33ebfe46f5b5652cfe29972cd34c99af"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.659694 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerStarted","Data":"ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.660846 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"77edf7e6-7742-4af2-881f-006b87a2ce90","Type":"ContainerStarted","Data":"f919c57f184f2a8e1684ab3be3e022616ed40d8732ca734c531a537f0bdf721d"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.662244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerStarted","Data":"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.662278 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerStarted","Data":"b2b321fa4b8438a1ab2771db5c515d7a4c4a634ac6710dd749c812cbb1c570e3"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.663820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" event={"ID":"c254e046-e923-4f68-8d96-e098b4ba354e","Type":"ContainerStarted","Data":"e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.668492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mn9pm" event={"ID":"5a0b6f27-8285-42ab-99ee-35c1b6370602","Type":"ContainerStarted","Data":"af47ab8048ae97300d367ea13bca5094cd2d5e8c4bb3543b179640fe548d3c80"} Oct 03 07:18:18 crc kubenswrapper[4810]: I1003 07:18:18.700311 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mn9pm" podStartSLOduration=2.700291741 podStartE2EDuration="2.700291741s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:18.696436867 +0000 UTC m=+1332.123687602" watchObservedRunningTime="2025-10-03 07:18:18.700291741 +0000 UTC m=+1332.127542476" Oct 03 07:18:19 crc kubenswrapper[4810]: I1003 07:18:19.680302 4810 generic.go:334] "Generic (PLEG): container finished" podID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerID="eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448" exitCode=0 Oct 03 07:18:19 crc kubenswrapper[4810]: I1003 07:18:19.680385 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" 
event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerDied","Data":"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448"} Oct 03 07:18:19 crc kubenswrapper[4810]: I1003 07:18:19.685125 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" event={"ID":"c254e046-e923-4f68-8d96-e098b4ba354e","Type":"ContainerStarted","Data":"5967e80a1430b2f097bc5422a0203827666c884ad43908ec20853a87042ca1d9"} Oct 03 07:18:19 crc kubenswrapper[4810]: I1003 07:18:19.728934 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" podStartSLOduration=2.7289170130000002 podStartE2EDuration="2.728917013s" podCreationTimestamp="2025-10-03 07:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:19.724934666 +0000 UTC m=+1333.152185411" watchObservedRunningTime="2025-10-03 07:18:19.728917013 +0000 UTC m=+1333.156167758" Oct 03 07:18:20 crc kubenswrapper[4810]: I1003 07:18:20.225531 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:20 crc kubenswrapper[4810]: I1003 07:18:20.233407 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:20 crc kubenswrapper[4810]: I1003 07:18:20.697232 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerStarted","Data":"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079"} Oct 03 07:18:20 crc kubenswrapper[4810]: I1003 07:18:20.716087 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" podStartSLOduration=4.716066293 podStartE2EDuration="4.716066293s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:20.714108049 +0000 UTC m=+1334.141358804" watchObservedRunningTime="2025-10-03 07:18:20.716066293 +0000 UTC m=+1334.143317038" Oct 03 07:18:21 crc kubenswrapper[4810]: I1003 07:18:21.705264 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.722340 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06504fdc-7e9c-453c-9661-3a921536ada2","Type":"ContainerStarted","Data":"daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.726869 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerStarted","Data":"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.726947 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerStarted","Data":"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.727048 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" 
containerName="nova-metadata-metadata" containerID="cri-o://db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" gracePeriod=30 Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.727043 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-log" containerID="cri-o://065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" gracePeriod=30 Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.729784 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerStarted","Data":"c7d1d770ee99b774d99f0d4f9d1665eb327b3f5064b186c996508f4ca369c666"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.729836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerStarted","Data":"a45f333c001d7ab511200f81f6fa6f80c1ad4e082222f60409447082f87c2d1e"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.734200 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="77edf7e6-7742-4af2-881f-006b87a2ce90" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611" gracePeriod=30 Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.734271 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"77edf7e6-7742-4af2-881f-006b87a2ce90","Type":"ContainerStarted","Data":"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611"} Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.745901 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.14887179 podStartE2EDuration="6.745869559s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="2025-10-03 07:18:18.129396384 +0000 UTC m=+1331.556647109" lastFinishedPulling="2025-10-03 07:18:21.726394113 +0000 UTC m=+1335.153644878" observedRunningTime="2025-10-03 07:18:22.738945254 +0000 UTC m=+1336.166195989" watchObservedRunningTime="2025-10-03 07:18:22.745869559 +0000 UTC m=+1336.173120294" Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.774675 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.1488706300000002 podStartE2EDuration="6.774653471s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="2025-10-03 07:18:17.994663054 +0000 UTC m=+1331.421913789" lastFinishedPulling="2025-10-03 07:18:21.620445855 +0000 UTC m=+1335.047696630" observedRunningTime="2025-10-03 07:18:22.764281433 +0000 UTC m=+1336.191532178" watchObservedRunningTime="2025-10-03 07:18:22.774653471 +0000 UTC m=+1336.201904216" Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.788192 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.278404122 podStartE2EDuration="6.788171064s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="2025-10-03 07:18:18.107485357 +0000 UTC m=+1331.534736092" lastFinishedPulling="2025-10-03 07:18:21.617252299 +0000 UTC m=+1335.044503034" observedRunningTime="2025-10-03 07:18:22.77948384 +0000 UTC m=+1336.206734575" 
watchObservedRunningTime="2025-10-03 07:18:22.788171064 +0000 UTC m=+1336.215421799" Oct 03 07:18:22 crc kubenswrapper[4810]: I1003 07:18:22.806437 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8747386759999998 podStartE2EDuration="6.806419373s" podCreationTimestamp="2025-10-03 07:18:16 +0000 UTC" firstStartedPulling="2025-10-03 07:18:17.691001287 +0000 UTC m=+1331.118252022" lastFinishedPulling="2025-10-03 07:18:21.622681984 +0000 UTC m=+1335.049932719" observedRunningTime="2025-10-03 07:18:22.800294678 +0000 UTC m=+1336.227545413" watchObservedRunningTime="2025-10-03 07:18:22.806419373 +0000 UTC m=+1336.233670108" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.333382 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.376545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data\") pod \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.376773 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle\") pod \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.376921 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs\") pod \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.376977 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz27c\" (UniqueName: \"kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c\") pod \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\" (UID: \"1ba961e6-d6a2-4a88-914d-4c2a01e783b3\") " Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.377255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs" (OuterVolumeSpecName: "logs") pod "1ba961e6-d6a2-4a88-914d-4c2a01e783b3" (UID: "1ba961e6-d6a2-4a88-914d-4c2a01e783b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.377736 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.381991 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c" (OuterVolumeSpecName: "kube-api-access-pz27c") pod "1ba961e6-d6a2-4a88-914d-4c2a01e783b3" (UID: "1ba961e6-d6a2-4a88-914d-4c2a01e783b3"). InnerVolumeSpecName "kube-api-access-pz27c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.410643 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ba961e6-d6a2-4a88-914d-4c2a01e783b3" (UID: "1ba961e6-d6a2-4a88-914d-4c2a01e783b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.413973 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data" (OuterVolumeSpecName: "config-data") pod "1ba961e6-d6a2-4a88-914d-4c2a01e783b3" (UID: "1ba961e6-d6a2-4a88-914d-4c2a01e783b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.480248 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.480314 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz27c\" (UniqueName: \"kubernetes.io/projected/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-kube-api-access-pz27c\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.480330 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba961e6-d6a2-4a88-914d-4c2a01e783b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.745993 4810 generic.go:334] "Generic (PLEG): container finished" podID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerID="db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" exitCode=0 Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746345 4810 generic.go:334] "Generic (PLEG): container finished" podID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerID="065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" exitCode=143 Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746328 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746208 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerDied","Data":"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4"} Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746459 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerDied","Data":"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd"} Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746481 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1ba961e6-d6a2-4a88-914d-4c2a01e783b3","Type":"ContainerDied","Data":"ae2a3440457fc0a2acc09ebe424932ca33ebfe46f5b5652cfe29972cd34c99af"} Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.746504 4810 scope.go:117] "RemoveContainer" containerID="db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.790620 4810 scope.go:117] "RemoveContainer" containerID="065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.803908 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.813579 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.830418 4810 scope.go:117] "RemoveContainer" containerID="db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.831469 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:23 crc kubenswrapper[4810]: E1003 07:18:23.832076 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-log" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.832101 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-log" Oct 03 07:18:23 crc kubenswrapper[4810]: E1003 07:18:23.832119 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-metadata" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.832131 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-metadata" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.832473 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-log" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.832507 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" containerName="nova-metadata-metadata" Oct 03 07:18:23 crc kubenswrapper[4810]: E1003 07:18:23.841202 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4\": container with ID starting with db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4 not found: ID does not exist" 
containerID="db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841259 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4"} err="failed to get container status \"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4\": rpc error: code = NotFound desc = could not find container \"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4\": container with ID starting with db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4 not found: ID does not exist" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841287 4810 scope.go:117] "RemoveContainer" containerID="065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" Oct 03 07:18:23 crc kubenswrapper[4810]: E1003 07:18:23.841585 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd\": container with ID starting with 065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd not found: ID does not exist" containerID="065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841616 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd"} err="failed to get container status \"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd\": rpc error: code = NotFound desc = could not find container \"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd\": container with ID starting with 065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd not found: ID does not exist" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841634 4810 scope.go:117] "RemoveContainer" containerID="db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841959 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4"} err="failed to get container status \"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4\": rpc error: code = NotFound desc = could not find container \"db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4\": container with ID starting with db8930c1478dee95188b7bdd3507c627fbe1cf3beb06c16b3090b15cc0fa9dc4 not found: ID does not exist" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.841984 4810 scope.go:117] "RemoveContainer" containerID="065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.842215 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd"} err="failed to get container status \"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd\": rpc error: code = NotFound desc = could not find container \"065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd\": container with ID starting with 065443df9f427b2fbf14387ebd8444638f9f582daf459e562f3b9ee947c312cd not found: ID does not exist" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.849200 4810 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.849314 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.853601 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.853882 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.893509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.893730 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.893905 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcpwr\" (UniqueName: \"kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.893960 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.894017 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.995275 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.995355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcpwr\" (UniqueName: \"kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.995389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " 
pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.995428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.995459 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:23 crc kubenswrapper[4810]: I1003 07:18:23.996094 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.001087 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.001103 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.003474 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.021074 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcpwr\" (UniqueName: \"kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr\") pod \"nova-metadata-0\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.171940 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.669331 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:24 crc kubenswrapper[4810]: W1003 07:18:24.679249 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5ac5026_8aab_4b9d_9579_fd782cdb27dc.slice/crio-9bf7c175fe30201c4d1288db5fb280b78d7e414ea547eccb6876ad6a02742fa0 WatchSource:0}: Error finding container 9bf7c175fe30201c4d1288db5fb280b78d7e414ea547eccb6876ad6a02742fa0: Status 404 returned error can't find the container with id 9bf7c175fe30201c4d1288db5fb280b78d7e414ea547eccb6876ad6a02742fa0 Oct 03 07:18:24 crc kubenswrapper[4810]: I1003 07:18:24.763615 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerStarted","Data":"9bf7c175fe30201c4d1288db5fb280b78d7e414ea547eccb6876ad6a02742fa0"} Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.328377 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba961e6-d6a2-4a88-914d-4c2a01e783b3" path="/var/lib/kubelet/pods/1ba961e6-d6a2-4a88-914d-4c2a01e783b3/volumes" Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.774070 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a0b6f27-8285-42ab-99ee-35c1b6370602" containerID="af47ab8048ae97300d367ea13bca5094cd2d5e8c4bb3543b179640fe548d3c80" exitCode=0 Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.774199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mn9pm" event={"ID":"5a0b6f27-8285-42ab-99ee-35c1b6370602","Type":"ContainerDied","Data":"af47ab8048ae97300d367ea13bca5094cd2d5e8c4bb3543b179640fe548d3c80"} Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.777293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerStarted","Data":"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032"} Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.777348 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerStarted","Data":"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1"} Oct 03 07:18:25 crc kubenswrapper[4810]: I1003 07:18:25.833294 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.833262064 podStartE2EDuration="2.833262064s" podCreationTimestamp="2025-10-03 07:18:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:25.822453835 +0000 UTC m=+1339.249704570" watchObservedRunningTime="2025-10-03 07:18:25.833262064 +0000 UTC m=+1339.260512799" Oct 03 07:18:26 crc kubenswrapper[4810]: I1003 07:18:26.788023 4810 generic.go:334] "Generic (PLEG): container finished" podID="c254e046-e923-4f68-8d96-e098b4ba354e" containerID="5967e80a1430b2f097bc5422a0203827666c884ad43908ec20853a87042ca1d9" exitCode=0 Oct 03 07:18:26 crc kubenswrapper[4810]: I1003 07:18:26.788078 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" 
event={"ID":"c254e046-e923-4f68-8d96-e098b4ba354e","Type":"ContainerDied","Data":"5967e80a1430b2f097bc5422a0203827666c884ad43908ec20853a87042ca1d9"} Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.147046 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.262688 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76gxv\" (UniqueName: \"kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv\") pod \"5a0b6f27-8285-42ab-99ee-35c1b6370602\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.262946 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data\") pod \"5a0b6f27-8285-42ab-99ee-35c1b6370602\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.263005 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts\") pod \"5a0b6f27-8285-42ab-99ee-35c1b6370602\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.263064 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle\") pod \"5a0b6f27-8285-42ab-99ee-35c1b6370602\" (UID: \"5a0b6f27-8285-42ab-99ee-35c1b6370602\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.269023 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts" (OuterVolumeSpecName: "scripts") pod "5a0b6f27-8285-42ab-99ee-35c1b6370602" (UID: "5a0b6f27-8285-42ab-99ee-35c1b6370602"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.272870 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv" (OuterVolumeSpecName: "kube-api-access-76gxv") pod "5a0b6f27-8285-42ab-99ee-35c1b6370602" (UID: "5a0b6f27-8285-42ab-99ee-35c1b6370602"). InnerVolumeSpecName "kube-api-access-76gxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.291750 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data" (OuterVolumeSpecName: "config-data") pod "5a0b6f27-8285-42ab-99ee-35c1b6370602" (UID: "5a0b6f27-8285-42ab-99ee-35c1b6370602"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.298215 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a0b6f27-8285-42ab-99ee-35c1b6370602" (UID: "5a0b6f27-8285-42ab-99ee-35c1b6370602"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.336278 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.336321 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.352159 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.365729 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.365781 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.365792 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a0b6f27-8285-42ab-99ee-35c1b6370602-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.365801 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76gxv\" (UniqueName: \"kubernetes.io/projected/5a0b6f27-8285-42ab-99ee-35c1b6370602-kube-api-access-76gxv\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.404800 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.404850 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.417078 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.426794 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.427086 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="dnsmasq-dns" containerID="cri-o://d44fefa290c5c27bc60dfb747c4b1bf38228704c17026548965bb1f674d616ee" gracePeriod=10 Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.479863 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.803625 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mn9pm" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.803621 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mn9pm" event={"ID":"5a0b6f27-8285-42ab-99ee-35c1b6370602","Type":"ContainerDied","Data":"04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93"} Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.804125 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04fe1c35597a2a98b124050c88c5f77cbe87ce8fe56e464a97754c682b133f93" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.806213 4810 generic.go:334] "Generic (PLEG): container finished" podID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerID="d44fefa290c5c27bc60dfb747c4b1bf38228704c17026548965bb1f674d616ee" exitCode=0 Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.806368 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" event={"ID":"9fec4478-2501-4fbe-90f2-05b2f239c265","Type":"ContainerDied","Data":"d44fefa290c5c27bc60dfb747c4b1bf38228704c17026548965bb1f674d616ee"} Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.854181 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.861249 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975589 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb\") pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975637 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb\") pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config\") pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975717 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc2pl\" (UniqueName: \"kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl\") pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975794 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0\") pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.975929 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc\") 
pod \"9fec4478-2501-4fbe-90f2-05b2f239c265\" (UID: \"9fec4478-2501-4fbe-90f2-05b2f239c265\") " Oct 03 07:18:27 crc kubenswrapper[4810]: I1003 07:18:27.994385 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl" (OuterVolumeSpecName: "kube-api-access-gc2pl") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "kube-api-access-gc2pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.025954 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.026208 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-log" containerID="cri-o://a45f333c001d7ab511200f81f6fa6f80c1ad4e082222f60409447082f87c2d1e" gracePeriod=30 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.026296 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-api" containerID="cri-o://c7d1d770ee99b774d99f0d4f9d1665eb327b3f5064b186c996508f4ca369c666" gracePeriod=30 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.050919 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.051698 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": EOF" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.053098 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config" (OuterVolumeSpecName: "config") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.060357 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.061960 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.062138 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-log" containerID="cri-o://33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" gracePeriod=30 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.062275 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-metadata" containerID="cri-o://ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" gracePeriod=30 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.078119 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.078149 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.078158 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc2pl\" (UniqueName: \"kubernetes.io/projected/9fec4478-2501-4fbe-90f2-05b2f239c265-kube-api-access-gc2pl\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.184089 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.186959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.205434 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fec4478-2501-4fbe-90f2-05b2f239c265" (UID: "9fec4478-2501-4fbe-90f2-05b2f239c265"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.286804 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.287508 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.287609 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fec4478-2501-4fbe-90f2-05b2f239c265-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.342376 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.395828 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.490245 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qflrc\" (UniqueName: \"kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc\") pod \"c254e046-e923-4f68-8d96-e098b4ba354e\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.490316 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts\") pod \"c254e046-e923-4f68-8d96-e098b4ba354e\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.490362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle\") pod \"c254e046-e923-4f68-8d96-e098b4ba354e\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.490488 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data\") pod \"c254e046-e923-4f68-8d96-e098b4ba354e\" (UID: \"c254e046-e923-4f68-8d96-e098b4ba354e\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.494763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts" (OuterVolumeSpecName: "scripts") pod "c254e046-e923-4f68-8d96-e098b4ba354e" (UID: "c254e046-e923-4f68-8d96-e098b4ba354e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.497585 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc" (OuterVolumeSpecName: "kube-api-access-qflrc") pod "c254e046-e923-4f68-8d96-e098b4ba354e" (UID: "c254e046-e923-4f68-8d96-e098b4ba354e"). InnerVolumeSpecName "kube-api-access-qflrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.517997 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data" (OuterVolumeSpecName: "config-data") pod "c254e046-e923-4f68-8d96-e098b4ba354e" (UID: "c254e046-e923-4f68-8d96-e098b4ba354e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.526777 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c254e046-e923-4f68-8d96-e098b4ba354e" (UID: "c254e046-e923-4f68-8d96-e098b4ba354e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.563019 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.592541 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.592580 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qflrc\" (UniqueName: \"kubernetes.io/projected/c254e046-e923-4f68-8d96-e098b4ba354e-kube-api-access-qflrc\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.592593 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.592602 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c254e046-e923-4f68-8d96-e098b4ba354e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694101 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs\") pod \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694249 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle\") pod \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694319 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data\") pod \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694496 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs\") pod \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\" (UID: 
\"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694579 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs" (OuterVolumeSpecName: "logs") pod "b5ac5026-8aab-4b9d-9579-fd782cdb27dc" (UID: "b5ac5026-8aab-4b9d-9579-fd782cdb27dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.694665 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcpwr\" (UniqueName: \"kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr\") pod \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\" (UID: \"b5ac5026-8aab-4b9d-9579-fd782cdb27dc\") " Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.695290 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.701185 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr" (OuterVolumeSpecName: "kube-api-access-qcpwr") pod "b5ac5026-8aab-4b9d-9579-fd782cdb27dc" (UID: "b5ac5026-8aab-4b9d-9579-fd782cdb27dc"). InnerVolumeSpecName "kube-api-access-qcpwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.724218 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data" (OuterVolumeSpecName: "config-data") pod "b5ac5026-8aab-4b9d-9579-fd782cdb27dc" (UID: "b5ac5026-8aab-4b9d-9579-fd782cdb27dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.726667 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5ac5026-8aab-4b9d-9579-fd782cdb27dc" (UID: "b5ac5026-8aab-4b9d-9579-fd782cdb27dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.758526 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b5ac5026-8aab-4b9d-9579-fd782cdb27dc" (UID: "b5ac5026-8aab-4b9d-9579-fd782cdb27dc"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.797999 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcpwr\" (UniqueName: \"kubernetes.io/projected/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-kube-api-access-qcpwr\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.798043 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.798064 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.798075 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5ac5026-8aab-4b9d-9579-fd782cdb27dc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.817940 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" event={"ID":"9fec4478-2501-4fbe-90f2-05b2f239c265","Type":"ContainerDied","Data":"9570473a79cb996d338ab1fc31d7e02e09ceaa566a0882ba24761715efeff89a"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.818004 4810 scope.go:117] "RemoveContainer" containerID="d44fefa290c5c27bc60dfb747c4b1bf38228704c17026548965bb1f674d616ee" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.818166 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7ddd66c7-gl95v" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827435 4810 generic.go:334] "Generic (PLEG): container finished" podID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerID="ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" exitCode=0 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827536 4810 generic.go:334] "Generic (PLEG): container finished" podID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerID="33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" exitCode=143 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827517 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerDied","Data":"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827624 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerDied","Data":"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.827645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5ac5026-8aab-4b9d-9579-fd782cdb27dc","Type":"ContainerDied","Data":"9bf7c175fe30201c4d1288db5fb280b78d7e414ea547eccb6876ad6a02742fa0"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.831209 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerID="a45f333c001d7ab511200f81f6fa6f80c1ad4e082222f60409447082f87c2d1e" exitCode=143 Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.831294 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerDied","Data":"a45f333c001d7ab511200f81f6fa6f80c1ad4e082222f60409447082f87c2d1e"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.844545 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.847106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dcgw8" event={"ID":"c254e046-e923-4f68-8d96-e098b4ba354e","Type":"ContainerDied","Data":"e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d"} Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.847160 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e26f949ea2b159553a05b2715c3d7bd64662aa12c852823db68781822b0ca60d" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.848315 4810 scope.go:117] "RemoveContainer" containerID="ed916efd4e80a6cf41b59abf0b2e8e60cf3f2dd6f2963d367638138bc4d8bce5" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.879945 4810 scope.go:117] "RemoveContainer" containerID="ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.911952 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.938110 4810 scope.go:117] "RemoveContainer" containerID="33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.949917 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f7ddd66c7-gl95v"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.972355 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.978401 4810 scope.go:117] "RemoveContainer" containerID="ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.978838 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032\": container with ID starting with ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032 not found: ID does not exist" containerID="ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.978880 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032"} err="failed to get container status \"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032\": rpc error: code = NotFound desc = could not find container \"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032\": container with ID starting with ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032 not found: ID does not exist" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.978981 4810 scope.go:117] "RemoveContainer" containerID="33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.979406 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1\": container with ID starting with 33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1 not found: ID does not exist" containerID="33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.979464 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1"} err="failed to get container status \"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1\": rpc error: code = NotFound desc = could not find container \"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1\": container with ID starting with 33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1 not found: ID does not exist" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.979488 4810 scope.go:117] "RemoveContainer" containerID="ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.979757 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032"} err="failed to get container status \"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032\": rpc error: code = NotFound desc = could not find container \"ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032\": container with ID starting with ff43a8302db8dad5e5a9858ba02e2a68ed49c1e6d70d558550f10e346b344032 not found: ID does not exist" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.979783 4810 scope.go:117] "RemoveContainer" containerID="33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.979882 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980025 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1"} err="failed to get container status 
\"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1\": rpc error: code = NotFound desc = could not find container \"33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1\": container with ID starting with 33d7ddc1d54b48af05114f97f2413526838db4247293b0ebf42ac2cbb44ca0d1 not found: ID does not exist" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980324 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0b6f27-8285-42ab-99ee-35c1b6370602" containerName="nova-manage" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980341 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0b6f27-8285-42ab-99ee-35c1b6370602" containerName="nova-manage" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980359 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c254e046-e923-4f68-8d96-e098b4ba354e" containerName="nova-cell1-conductor-db-sync" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980365 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c254e046-e923-4f68-8d96-e098b4ba354e" containerName="nova-cell1-conductor-db-sync" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980386 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-log" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980392 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-log" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980435 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="dnsmasq-dns" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980443 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="dnsmasq-dns" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980454 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="init" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980459 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="init" Oct 03 07:18:28 crc kubenswrapper[4810]: E1003 07:18:28.980470 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-metadata" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980475 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-metadata" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980637 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-log" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980656 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c254e046-e923-4f68-8d96-e098b4ba354e" containerName="nova-cell1-conductor-db-sync" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980666 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0b6f27-8285-42ab-99ee-35c1b6370602" containerName="nova-manage" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.980679 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" containerName="dnsmasq-dns" Oct 03 07:18:28 crc 
kubenswrapper[4810]: I1003 07:18:28.980687 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" containerName="nova-metadata-metadata" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.981402 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.984268 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.986802 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:28 crc kubenswrapper[4810]: I1003 07:18:28.995523 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.004957 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.007209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.009706 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.011395 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.011621 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw295\" (UniqueName: \"kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103614 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htfz\" (UniqueName: \"kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103689 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103747 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103768 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.103812 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.205690 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.205781 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.205807 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5htfz\" (UniqueName: \"kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.205912 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.206018 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.206052 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.206124 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.206188 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw295\" (UniqueName: \"kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.207767 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.209846 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.209945 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.211821 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.213055 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.225564 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.238287 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw295\" (UniqueName: \"kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295\") pod \"nova-cell1-conductor-0\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.239083 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5htfz\" (UniqueName: \"kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz\") pod \"nova-metadata-0\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " pod="openstack/nova-metadata-0" Oct 03 07:18:29 
crc kubenswrapper[4810]: I1003 07:18:29.303055 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.315803 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fec4478-2501-4fbe-90f2-05b2f239c265" path="/var/lib/kubelet/pods/9fec4478-2501-4fbe-90f2-05b2f239c265/volumes" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.316913 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ac5026-8aab-4b9d-9579-fd782cdb27dc" path="/var/lib/kubelet/pods/b5ac5026-8aab-4b9d-9579-fd782cdb27dc/volumes" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.325711 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.782535 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.823049 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:18:29 crc kubenswrapper[4810]: W1003 07:18:29.834661 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c0ac24c_ed16_445a_b135_41b696510fc8.slice/crio-5b4ea0be33fbe46ebdbfeeec75b259e170352ba801146c919041ff92c32572c2 WatchSource:0}: Error finding container 5b4ea0be33fbe46ebdbfeeec75b259e170352ba801146c919041ff92c32572c2: Status 404 returned error can't find the container with id 5b4ea0be33fbe46ebdbfeeec75b259e170352ba801146c919041ff92c32572c2 Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.836182 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.877460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6c0ac24c-ed16-445a-b135-41b696510fc8","Type":"ContainerStarted","Data":"5b4ea0be33fbe46ebdbfeeec75b259e170352ba801146c919041ff92c32572c2"} Oct 03 07:18:29 crc kubenswrapper[4810]: I1003 07:18:29.892687 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="06504fdc-7e9c-453c-9661-3a921536ada2" containerName="nova-scheduler-scheduler" containerID="cri-o://daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" gracePeriod=30 Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.902251 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6c0ac24c-ed16-445a-b135-41b696510fc8","Type":"ContainerStarted","Data":"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924"} Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.902754 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.905127 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerStarted","Data":"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62"} Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.905187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerStarted","Data":"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba"} Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.905208 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerStarted","Data":"7c3665e1744316b45a294248855a8848c46a7d9b62c0d8d10908c28886d936bf"} Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.924352 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.924333737 podStartE2EDuration="2.924333737s" podCreationTimestamp="2025-10-03 07:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:30.92108978 +0000 UTC m=+1344.348340535" watchObservedRunningTime="2025-10-03 07:18:30.924333737 +0000 UTC m=+1344.351584462" Oct 03 07:18:30 crc kubenswrapper[4810]: I1003 07:18:30.964817 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.964790661 podStartE2EDuration="2.964790661s" podCreationTimestamp="2025-10-03 07:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:30.946870191 +0000 UTC m=+1344.374120956" watchObservedRunningTime="2025-10-03 07:18:30.964790661 +0000 UTC m=+1344.392041416" Oct 03 07:18:32 crc kubenswrapper[4810]: E1003 07:18:32.407062 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:18:32 crc kubenswrapper[4810]: E1003 07:18:32.409149 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:18:32 crc kubenswrapper[4810]: E1003 07:18:32.410352 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:18:32 crc kubenswrapper[4810]: E1003 07:18:32.410395 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="06504fdc-7e9c-453c-9661-3a921536ada2" containerName="nova-scheduler-scheduler" Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.941757 4810 generic.go:334] "Generic (PLEG): container finished" podID="06504fdc-7e9c-453c-9661-3a921536ada2" containerID="daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" exitCode=0 Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.942151 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"06504fdc-7e9c-453c-9661-3a921536ada2","Type":"ContainerDied","Data":"daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744"} Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.942256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06504fdc-7e9c-453c-9661-3a921536ada2","Type":"ContainerDied","Data":"5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf"} Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.942268 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc5e96c1cc69a88d68fb5986904b0a696d5085dd9cc4bd71d24ca40ca0bdabf" Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.944357 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerID="c7d1d770ee99b774d99f0d4f9d1665eb327b3f5064b186c996508f4ca369c666" exitCode=0 Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.944381 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerDied","Data":"c7d1d770ee99b774d99f0d4f9d1665eb327b3f5064b186c996508f4ca369c666"} Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.944467 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c9171b82-fdec-4a10-a2af-1366e15feafe","Type":"ContainerDied","Data":"ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8"} Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.944482 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebc4b8f0b71e9351b8cd069fe60751b74623f311aff4f8d2a5f18a65cb9117c8" Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.944781 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.951487 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.996741 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs\") pod \"c9171b82-fdec-4a10-a2af-1366e15feafe\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.996814 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle\") pod \"c9171b82-fdec-4a10-a2af-1366e15feafe\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.996853 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data\") pod \"06504fdc-7e9c-453c-9661-3a921536ada2\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.996940 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data\") pod \"c9171b82-fdec-4a10-a2af-1366e15feafe\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.997051 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle\") pod \"06504fdc-7e9c-453c-9661-3a921536ada2\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.997083 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzrqm\" (UniqueName: \"kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm\") pod \"06504fdc-7e9c-453c-9661-3a921536ada2\" (UID: \"06504fdc-7e9c-453c-9661-3a921536ada2\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.997178 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5mhs\" (UniqueName: \"kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs\") pod \"c9171b82-fdec-4a10-a2af-1366e15feafe\" (UID: \"c9171b82-fdec-4a10-a2af-1366e15feafe\") " Oct 03 07:18:33 crc kubenswrapper[4810]: I1003 07:18:33.997479 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs" (OuterVolumeSpecName: "logs") pod "c9171b82-fdec-4a10-a2af-1366e15feafe" (UID: "c9171b82-fdec-4a10-a2af-1366e15feafe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.003483 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs" (OuterVolumeSpecName: "kube-api-access-q5mhs") pod "c9171b82-fdec-4a10-a2af-1366e15feafe" (UID: "c9171b82-fdec-4a10-a2af-1366e15feafe"). InnerVolumeSpecName "kube-api-access-q5mhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.004789 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm" (OuterVolumeSpecName: "kube-api-access-rzrqm") pod "06504fdc-7e9c-453c-9661-3a921536ada2" (UID: "06504fdc-7e9c-453c-9661-3a921536ada2"). InnerVolumeSpecName "kube-api-access-rzrqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.027650 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9171b82-fdec-4a10-a2af-1366e15feafe" (UID: "c9171b82-fdec-4a10-a2af-1366e15feafe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.032536 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data" (OuterVolumeSpecName: "config-data") pod "c9171b82-fdec-4a10-a2af-1366e15feafe" (UID: "c9171b82-fdec-4a10-a2af-1366e15feafe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.035933 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06504fdc-7e9c-453c-9661-3a921536ada2" (UID: "06504fdc-7e9c-453c-9661-3a921536ada2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.044232 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data" (OuterVolumeSpecName: "config-data") pod "06504fdc-7e9c-453c-9661-3a921536ada2" (UID: "06504fdc-7e9c-453c-9661-3a921536ada2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.098690 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099026 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099038 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzrqm\" (UniqueName: \"kubernetes.io/projected/06504fdc-7e9c-453c-9661-3a921536ada2-kube-api-access-rzrqm\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099050 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5mhs\" (UniqueName: \"kubernetes.io/projected/c9171b82-fdec-4a10-a2af-1366e15feafe-kube-api-access-q5mhs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099059 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9171b82-fdec-4a10-a2af-1366e15feafe-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099067 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9171b82-fdec-4a10-a2af-1366e15feafe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.099078 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06504fdc-7e9c-453c-9661-3a921536ada2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.290813 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.291133 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" containerName="kube-state-metrics" containerID="cri-o://d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f" gracePeriod=30 Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.326762 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.326821 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.879845 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.914352 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxj8r\" (UniqueName: \"kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r\") pod \"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c\" (UID: \"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c\") " Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.921178 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r" (OuterVolumeSpecName: "kube-api-access-mxj8r") pod "99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" (UID: "99d4cb8b-1033-4fc9-b5dd-ad30b89a754c"). InnerVolumeSpecName "kube-api-access-mxj8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.959926 4810 generic.go:334] "Generic (PLEG): container finished" podID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" containerID="d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f" exitCode=2 Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.959999 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.960022 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c","Type":"ContainerDied","Data":"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f"} Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.960051 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.960067 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"99d4cb8b-1033-4fc9-b5dd-ad30b89a754c","Type":"ContainerDied","Data":"3ad66bca48961feb88f68c26578a8f2c2fc44ddc1d6cedf80a471be1d61d8e54"} Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.960086 4810 scope.go:117] "RemoveContainer" containerID="d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f" Oct 03 07:18:34 crc kubenswrapper[4810]: I1003 07:18:34.960114 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.010995 4810 scope.go:117] "RemoveContainer" containerID="d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f" Oct 03 07:18:35 crc kubenswrapper[4810]: E1003 07:18:35.011599 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f\": container with ID starting with d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f not found: ID does not exist" containerID="d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.011650 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f"} err="failed to get container status \"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f\": rpc error: code = NotFound desc = could not find container \"d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f\": container with ID starting with d65cae36d64196eb1968c412aeb1ad14cb82179646440e98ff9eb067fe88eb6f not found: ID does not exist" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.020631 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxj8r\" (UniqueName: \"kubernetes.io/projected/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c-kube-api-access-mxj8r\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.030305 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.048712 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.063462 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: E1003 07:18:35.063988 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" containerName="kube-state-metrics" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064012 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" containerName="kube-state-metrics" Oct 03 07:18:35 crc kubenswrapper[4810]: E1003 07:18:35.064028 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-api" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064034 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-api" Oct 03 07:18:35 crc kubenswrapper[4810]: E1003 07:18:35.064049 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-log" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064056 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-log" Oct 03 07:18:35 crc kubenswrapper[4810]: E1003 07:18:35.064069 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06504fdc-7e9c-453c-9661-3a921536ada2" containerName="nova-scheduler-scheduler" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064086 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06504fdc-7e9c-453c-9661-3a921536ada2" containerName="nova-scheduler-scheduler" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064326 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-log" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064347 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" containerName="nova-api-api" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064360 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" containerName="kube-state-metrics" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.064383 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="06504fdc-7e9c-453c-9661-3a921536ada2" containerName="nova-scheduler-scheduler" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.065775 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.075561 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.076368 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.094702 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.108448 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.120519 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.121735 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.121789 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.121821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hzwg\" (UniqueName: \"kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.122086 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.132146 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.148356 4810 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.151610 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.153996 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.156205 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.164834 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.176047 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.176158 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.180455 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.203295 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.223265 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.223320 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.223343 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.223633 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6bv\" (UniqueName: \"kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.224194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.224281 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.224338 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hzwg\" (UniqueName: \"kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.225195 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.225375 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.225437 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.225535 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxqr\" (UniqueName: \"kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.230357 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.234963 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.235038 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.251074 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hzwg\" (UniqueName: \"kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg\") pod \"nova-api-0\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.317818 4810 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06504fdc-7e9c-453c-9661-3a921536ada2" path="/var/lib/kubelet/pods/06504fdc-7e9c-453c-9661-3a921536ada2/volumes" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.318677 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d4cb8b-1033-4fc9-b5dd-ad30b89a754c" path="/var/lib/kubelet/pods/99d4cb8b-1033-4fc9-b5dd-ad30b89a754c/volumes" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.319424 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9171b82-fdec-4a10-a2af-1366e15feafe" path="/var/lib/kubelet/pods/c9171b82-fdec-4a10-a2af-1366e15feafe/volumes" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6bv\" (UniqueName: \"kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327314 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxqr\" (UniqueName: \"kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327545 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.327576 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.331460 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.332134 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.332355 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.332813 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.334081 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.347302 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6bv\" (UniqueName: \"kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv\") pod \"nova-scheduler-0\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.348037 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxqr\" (UniqueName: \"kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr\") pod \"kube-state-metrics-0\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.396405 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.473058 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.493586 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.756321 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:18:35 crc kubenswrapper[4810]: I1003 07:18:35.972587 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerStarted","Data":"0b6da545f747d11a7ee3d6163497447c6381e70932dc2cbb2758d44063d75534"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.044365 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:18:36 crc kubenswrapper[4810]: W1003 07:18:36.054234 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb6eb4ae_ef62_480d_80a9_411be811155b.slice/crio-46d3634a1200e9e509db8afd33ab98d129832c208ae10bc6f67192a5b9adc6e0 WatchSource:0}: Error finding container 46d3634a1200e9e509db8afd33ab98d129832c208ae10bc6f67192a5b9adc6e0: Status 404 returned error can't find the container with id 46d3634a1200e9e509db8afd33ab98d129832c208ae10bc6f67192a5b9adc6e0 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.105620 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:18:36 crc kubenswrapper[4810]: W1003 07:18:36.107770 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod873ef77f_c12d_4c61_bf40_7b49e6988629.slice/crio-e2ace4166333370775b244b2af4191d6f73c93f622d8531f278a2cfd489fe5dc WatchSource:0}: Error finding container e2ace4166333370775b244b2af4191d6f73c93f622d8531f278a2cfd489fe5dc: Status 404 returned error can't find the container with id e2ace4166333370775b244b2af4191d6f73c93f622d8531f278a2cfd489fe5dc Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.493337 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.494225 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-central-agent" containerID="cri-o://b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d" gracePeriod=30 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.494647 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="proxy-httpd" containerID="cri-o://4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a" gracePeriod=30 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.494770 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-notification-agent" containerID="cri-o://ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0" gracePeriod=30 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.494832 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="sg-core" containerID="cri-o://437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a" gracePeriod=30 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.983123 4810 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerStarted","Data":"93efb94f140bc50a949f93e2e63517e8845aa87508f6c501792f6a74a557b800"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.983378 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerStarted","Data":"97e7825e9f72346aba58ffc17325579724021ef5c4b09256729c000b1df3ba80"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987480 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerID="4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a" exitCode=0 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987504 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerID="437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a" exitCode=2 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987511 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerID="b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d" exitCode=0 Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987543 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerDied","Data":"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987561 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerDied","Data":"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.987570 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerDied","Data":"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.989935 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb6eb4ae-ef62-480d-80a9-411be811155b","Type":"ContainerStarted","Data":"2453516ab32d8b1790716264b607d141d5a90ef1a6cefa4be573a9c5999f1aee"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.989983 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb6eb4ae-ef62-480d-80a9-411be811155b","Type":"ContainerStarted","Data":"46d3634a1200e9e509db8afd33ab98d129832c208ae10bc6f67192a5b9adc6e0"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.991102 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.992386 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"873ef77f-c12d-4c61-bf40-7b49e6988629","Type":"ContainerStarted","Data":"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab"} Oct 03 07:18:36 crc kubenswrapper[4810]: I1003 07:18:36.992409 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"873ef77f-c12d-4c61-bf40-7b49e6988629","Type":"ContainerStarted","Data":"e2ace4166333370775b244b2af4191d6f73c93f622d8531f278a2cfd489fe5dc"} Oct 03 07:18:37 crc 
kubenswrapper[4810]: I1003 07:18:37.025587 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.025564596 podStartE2EDuration="2.025564596s" podCreationTimestamp="2025-10-03 07:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:37.022633748 +0000 UTC m=+1350.449884493" watchObservedRunningTime="2025-10-03 07:18:37.025564596 +0000 UTC m=+1350.452815331" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.035144 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.035123823 podStartE2EDuration="2.035123823s" podCreationTimestamp="2025-10-03 07:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:37.003498025 +0000 UTC m=+1350.430748780" watchObservedRunningTime="2025-10-03 07:18:37.035123823 +0000 UTC m=+1350.462374568" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.053614 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.66926394 podStartE2EDuration="2.053591637s" podCreationTimestamp="2025-10-03 07:18:35 +0000 UTC" firstStartedPulling="2025-10-03 07:18:36.057535339 +0000 UTC m=+1349.484786074" lastFinishedPulling="2025-10-03 07:18:36.441863036 +0000 UTC m=+1349.869113771" observedRunningTime="2025-10-03 07:18:37.042028298 +0000 UTC m=+1350.469279043" watchObservedRunningTime="2025-10-03 07:18:37.053591637 +0000 UTC m=+1350.480842372" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.701471 4810 util.go:48] "No ready sandbox for pod can be found. 
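The three pod_startup_latency_tracker entries just above are internally consistent: for kube-state-metrics-0, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of the logged numbers, assuming exactly that relationship:

```go
package main

import (
	"fmt"
	"time"
)

// Wall-clock timestamps copied from the kube-state-metrics-0 entry above
// (the monotonic "m=+..." suffixes are dropped).
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-03 07:18:35 +0000 UTC")
	watchObserved := mustParse("2025-10-03 07:18:37.053591637 +0000 UTC")
	pullStart := mustParse("2025-10-03 07:18:36.057535339 +0000 UTC")
	pullEnd := mustParse("2025-10-03 07:18:36.441863036 +0000 UTC")

	e2e := watchObserved.Sub(created)   // 2.053591637s, matches podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // 1.66926394s, matches podStartSLOduration
	fmt.Println(e2e, slo)
}
```

The program prints 2.053591637s and 1.66926394s, matching the logged values; for nova-api-0 and nova-scheduler-0 the pull timestamps are the zero value, so the two durations coincide.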
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.774933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775322 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhhwk\" (UniqueName: \"kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775529 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775554 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775600 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.775635 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml\") pod \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\" (UID: \"ad3cc8fa-7036-4fc5-883b-e2b955e35397\") " Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.776637 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.776992 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.793768 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts" (OuterVolumeSpecName: "scripts") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.793910 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk" (OuterVolumeSpecName: "kube-api-access-bhhwk") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "kube-api-access-bhhwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.819595 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.877228 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.877438 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad3cc8fa-7036-4fc5-883b-e2b955e35397-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.877545 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.877640 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.877720 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhhwk\" (UniqueName: \"kubernetes.io/projected/ad3cc8fa-7036-4fc5-883b-e2b955e35397-kube-api-access-bhhwk\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.894329 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.947913 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data" (OuterVolumeSpecName: "config-data") pod "ad3cc8fa-7036-4fc5-883b-e2b955e35397" (UID: "ad3cc8fa-7036-4fc5-883b-e2b955e35397"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.980103 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:37 crc kubenswrapper[4810]: I1003 07:18:37.982946 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3cc8fa-7036-4fc5-883b-e2b955e35397-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.010920 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerID="ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0" exitCode=0 Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.011884 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.016707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerDied","Data":"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0"} Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.016788 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad3cc8fa-7036-4fc5-883b-e2b955e35397","Type":"ContainerDied","Data":"bb897473411aa5fdaee5c123949c73136ad59a4a5b71f3e333212df8bbadae21"} Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.016808 4810 scope.go:117] "RemoveContainer" containerID="4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.051510 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.060418 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.060767 4810 scope.go:117] "RemoveContainer" containerID="437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076172 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.076520 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-notification-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076537 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-notification-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.076557 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="proxy-httpd" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076563 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="proxy-httpd" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.076582 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="sg-core" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076588 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="sg-core" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.076610 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-central-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076617 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-central-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076777 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-notification-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076788 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="sg-core" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076800 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="ceilometer-central-agent" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.076811 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" containerName="proxy-httpd" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.087387 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.090463 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.090686 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.091052 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.100464 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.115667 4810 scope.go:117] "RemoveContainer" containerID="ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.136571 4810 scope.go:117] "RemoveContainer" containerID="b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.164851 4810 scope.go:117] "RemoveContainer" containerID="4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.165628 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a\": container with ID starting with 4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a not found: ID does not exist" containerID="4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.165663 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a"} err="failed to get container status \"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a\": rpc error: code = NotFound desc = could not find container 
\"4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a\": container with ID starting with 4256e784b3121d03bbf624e7255d9541e12f175f3b4a360836cc87851b6fdd3a not found: ID does not exist" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.165688 4810 scope.go:117] "RemoveContainer" containerID="437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.165999 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a\": container with ID starting with 437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a not found: ID does not exist" containerID="437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.166027 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a"} err="failed to get container status \"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a\": rpc error: code = NotFound desc = could not find container \"437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a\": container with ID starting with 437e413cd803222398a5b8ea2d2f92044e6c295d70d34d7912bbc90288c5e91a not found: ID does not exist" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.166042 4810 scope.go:117] "RemoveContainer" containerID="ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.166290 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0\": container with ID starting with ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0 not found: ID does not exist" containerID="ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.166344 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0"} err="failed to get container status \"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0\": rpc error: code = NotFound desc = could not find container \"ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0\": container with ID starting with ba3699cfea40d515193d0067cf1417484335ee5a251670dbbcb244f3570643a0 not found: ID does not exist" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.166362 4810 scope.go:117] "RemoveContainer" containerID="b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d" Oct 03 07:18:38 crc kubenswrapper[4810]: E1003 07:18:38.166934 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d\": container with ID starting with b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d not found: ID does not exist" containerID="b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.167030 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d"} 
err="failed to get container status \"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d\": rpc error: code = NotFound desc = could not find container \"b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d\": container with ID starting with b826c82d46bf7ae43c8d78828c73502d338e838e46ec4d95aa969a4d33f9115d not found: ID does not exist" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.188493 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.188567 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.188608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.188966 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.189122 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.189155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbrw\" (UniqueName: \"kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.189185 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.189218 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.291881 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292040 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292087 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbrw\" (UniqueName: \"kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292256 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292278 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292914 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.292943 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.295674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.296462 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.296736 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.297369 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.299368 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.310092 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbrw\" (UniqueName: \"kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw\") pod \"ceilometer-0\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.414933 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:18:38 crc kubenswrapper[4810]: I1003 07:18:38.938372 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:39 crc kubenswrapper[4810]: I1003 07:18:39.021204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerStarted","Data":"9da4e36ac16fb5121498f8d74a4d5cc871586e1d5b0c8181f6fe1665c4d89cdd"} Oct 03 07:18:39 crc kubenswrapper[4810]: I1003 07:18:39.314243 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3cc8fa-7036-4fc5-883b-e2b955e35397" path="/var/lib/kubelet/pods/ad3cc8fa-7036-4fc5-883b-e2b955e35397/volumes" Oct 03 07:18:39 crc kubenswrapper[4810]: I1003 07:18:39.327082 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 07:18:39 crc kubenswrapper[4810]: I1003 07:18:39.327115 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 07:18:39 crc kubenswrapper[4810]: I1003 07:18:39.351601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 07:18:40 crc kubenswrapper[4810]: I1003 07:18:40.040166 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerStarted","Data":"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a"} Oct 03 07:18:40 crc kubenswrapper[4810]: I1003 07:18:40.342097 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 07:18:40 crc kubenswrapper[4810]: I1003 07:18:40.342192 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 07:18:40 crc kubenswrapper[4810]: I1003 07:18:40.494674 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 07:18:41 crc kubenswrapper[4810]: I1003 07:18:41.054943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerStarted","Data":"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc"} Oct 03 07:18:41 crc kubenswrapper[4810]: I1003 07:18:41.055362 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerStarted","Data":"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c"} Oct 03 07:18:43 crc kubenswrapper[4810]: I1003 07:18:43.109643 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerStarted","Data":"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c"} Oct 03 07:18:43 crc kubenswrapper[4810]: I1003 07:18:43.110312 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 
07:18:43 crc kubenswrapper[4810]: I1003 07:18:43.156177 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.181734784 podStartE2EDuration="5.156152352s" podCreationTimestamp="2025-10-03 07:18:38 +0000 UTC" firstStartedPulling="2025-10-03 07:18:38.936985312 +0000 UTC m=+1352.364236047" lastFinishedPulling="2025-10-03 07:18:41.91140287 +0000 UTC m=+1355.338653615" observedRunningTime="2025-10-03 07:18:43.146959846 +0000 UTC m=+1356.574210591" watchObservedRunningTime="2025-10-03 07:18:43.156152352 +0000 UTC m=+1356.583403107" Oct 03 07:18:45 crc kubenswrapper[4810]: I1003 07:18:45.397346 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:18:45 crc kubenswrapper[4810]: I1003 07:18:45.397833 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:18:45 crc kubenswrapper[4810]: I1003 07:18:45.486357 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 07:18:45 crc kubenswrapper[4810]: I1003 07:18:45.495251 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 07:18:45 crc kubenswrapper[4810]: I1003 07:18:45.541829 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 07:18:46 crc kubenswrapper[4810]: I1003 07:18:46.188345 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 07:18:46 crc kubenswrapper[4810]: I1003 07:18:46.479171 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 07:18:46 crc kubenswrapper[4810]: I1003 07:18:46.479284 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 07:18:49 crc kubenswrapper[4810]: I1003 07:18:49.331228 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 07:18:49 crc kubenswrapper[4810]: I1003 07:18:49.334927 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 07:18:49 crc kubenswrapper[4810]: I1003 07:18:49.338884 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 07:18:50 crc kubenswrapper[4810]: I1003 07:18:50.213776 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.154455 4810 util.go:48] "No ready sandbox for pod can be found. 
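The Probe failed entries at 07:18:40 and 07:18:46 above are plain HTTP GETs against the pod IPs that hit the probe client's timeout before any response headers arrived, hence the "Client.Timeout exceeded while awaiting headers" output; by 07:18:49 through 07:18:55 the same startup probes flip to started and the readiness probes to ready. A small reproduction of that failure mode with net/http, where the slow test server and the 500 ms timeout are illustrative stand-ins:

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// A deliberately slow endpoint standing in for an API that is still starting up.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
		w.WriteHeader(http.StatusOK)
	}))
	defer slow.Close()

	probe := &http.Client{Timeout: 500 * time.Millisecond}
	_, err := probe.Get(slow.URL)
	fmt.Println(err) // ... (Client.Timeout exceeded while awaiting headers)
}
```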
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230050 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle\") pod \"77edf7e6-7742-4af2-881f-006b87a2ce90\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230159 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4m5x\" (UniqueName: \"kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x\") pod \"77edf7e6-7742-4af2-881f-006b87a2ce90\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230377 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data\") pod \"77edf7e6-7742-4af2-881f-006b87a2ce90\" (UID: \"77edf7e6-7742-4af2-881f-006b87a2ce90\") " Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230460 4810 generic.go:334] "Generic (PLEG): container finished" podID="77edf7e6-7742-4af2-881f-006b87a2ce90" containerID="c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611" exitCode=137 Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230506 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"77edf7e6-7742-4af2-881f-006b87a2ce90","Type":"ContainerDied","Data":"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611"} Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230539 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"77edf7e6-7742-4af2-881f-006b87a2ce90","Type":"ContainerDied","Data":"f919c57f184f2a8e1684ab3be3e022616ed40d8732ca734c531a537f0bdf721d"} Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230561 4810 scope.go:117] "RemoveContainer" containerID="c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.230717 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.236169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x" (OuterVolumeSpecName: "kube-api-access-m4m5x") pod "77edf7e6-7742-4af2-881f-006b87a2ce90" (UID: "77edf7e6-7742-4af2-881f-006b87a2ce90"). InnerVolumeSpecName "kube-api-access-m4m5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.255575 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77edf7e6-7742-4af2-881f-006b87a2ce90" (UID: "77edf7e6-7742-4af2-881f-006b87a2ce90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.265783 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data" (OuterVolumeSpecName: "config-data") pod "77edf7e6-7742-4af2-881f-006b87a2ce90" (UID: "77edf7e6-7742-4af2-881f-006b87a2ce90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.329777 4810 scope.go:117] "RemoveContainer" containerID="c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611" Oct 03 07:18:53 crc kubenswrapper[4810]: E1003 07:18:53.330232 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611\": container with ID starting with c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611 not found: ID does not exist" containerID="c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.330268 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611"} err="failed to get container status \"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611\": rpc error: code = NotFound desc = could not find container \"c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611\": container with ID starting with c5329af441002c3eec1c498d10048d404b26e75492638a673865e5b2d369b611 not found: ID does not exist" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.333013 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.333048 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77edf7e6-7742-4af2-881f-006b87a2ce90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.333064 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4m5x\" (UniqueName: \"kubernetes.io/projected/77edf7e6-7742-4af2-881f-006b87a2ce90-kube-api-access-m4m5x\") on node \"crc\" DevicePath \"\"" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.562165 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.573396 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.591883 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:53 crc kubenswrapper[4810]: E1003 07:18:53.592455 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77edf7e6-7742-4af2-881f-006b87a2ce90" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.592481 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="77edf7e6-7742-4af2-881f-006b87a2ce90" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.592730 4810 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="77edf7e6-7742-4af2-881f-006b87a2ce90" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.593557 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.596119 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.596343 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.597521 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.601219 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.739360 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtg8l\" (UniqueName: \"kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.739516 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.739702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.740062 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.740149 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.841801 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.842108 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.842200 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.842284 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtg8l\" (UniqueName: \"kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.842339 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.846462 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.847379 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.848002 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.849661 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.860187 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtg8l\" (UniqueName: \"kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:53 crc kubenswrapper[4810]: I1003 07:18:53.921646 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:54 crc kubenswrapper[4810]: I1003 07:18:54.409136 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:18:54 crc kubenswrapper[4810]: W1003 07:18:54.414409 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8d0a7a_6384_400f_91b8_3bf1cfb785a8.slice/crio-4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785 WatchSource:0}: Error finding container 4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785: Status 404 returned error can't find the container with id 4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785 Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.257479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8","Type":"ContainerStarted","Data":"6ae3cb61504efe6139088cb930100070bf14667b9b9ded1ec426e0e440d762ab"} Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.258444 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8","Type":"ContainerStarted","Data":"4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785"} Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.301809 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.301781791 podStartE2EDuration="2.301781791s" podCreationTimestamp="2025-10-03 07:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:55.287812246 +0000 UTC m=+1368.715063021" watchObservedRunningTime="2025-10-03 07:18:55.301781791 +0000 UTC m=+1368.729032526" Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.317087 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77edf7e6-7742-4af2-881f-006b87a2ce90" path="/var/lib/kubelet/pods/77edf7e6-7742-4af2-881f-006b87a2ce90/volumes" Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.402948 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.403524 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.405592 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 07:18:55 crc kubenswrapper[4810]: I1003 07:18:55.416494 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.267943 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.271501 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.471519 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.473362 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.497851 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605711 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605917 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605948 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqgrm\" (UniqueName: \"kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.605972 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.707767 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.707887 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.707958 4810 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.707983 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgrm\" (UniqueName: \"kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.708013 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.708097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.709103 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.709636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.710547 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.710673 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.710933 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.736814 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqgrm\" (UniqueName: 
\"kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm\") pod \"dnsmasq-dns-59fc6f8ff7-45579\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:56 crc kubenswrapper[4810]: I1003 07:18:56.809410 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:57 crc kubenswrapper[4810]: W1003 07:18:57.313345 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod494d64ab_9bc7_4bc6_8cc4_1085679458bb.slice/crio-5798f8de169134cd8932aa65143af585f5fc92ee28445f5e4fa154e563d3f7e9 WatchSource:0}: Error finding container 5798f8de169134cd8932aa65143af585f5fc92ee28445f5e4fa154e563d3f7e9: Status 404 returned error can't find the container with id 5798f8de169134cd8932aa65143af585f5fc92ee28445f5e4fa154e563d3f7e9 Oct 03 07:18:57 crc kubenswrapper[4810]: I1003 07:18:57.313806 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.290394 4810 generic.go:334] "Generic (PLEG): container finished" podID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerID="d78be4071ad7d91cc40af56bcbd986c937fd0441f9a42d2d109631181b9647a8" exitCode=0 Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.291031 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" event={"ID":"494d64ab-9bc7-4bc6-8cc4-1085679458bb","Type":"ContainerDied","Data":"d78be4071ad7d91cc40af56bcbd986c937fd0441f9a42d2d109631181b9647a8"} Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.291086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" event={"ID":"494d64ab-9bc7-4bc6-8cc4-1085679458bb","Type":"ContainerStarted","Data":"5798f8de169134cd8932aa65143af585f5fc92ee28445f5e4fa154e563d3f7e9"} Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.353875 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.354210 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-central-agent" containerID="cri-o://37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a" gracePeriod=30 Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.354348 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="sg-core" containerID="cri-o://44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc" gracePeriod=30 Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.354408 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-notification-agent" containerID="cri-o://ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c" gracePeriod=30 Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.354529 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="proxy-httpd" containerID="cri-o://5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c" gracePeriod=30 Oct 03 07:18:58 
crc kubenswrapper[4810]: I1003 07:18:58.368224 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 03 07:18:58 crc kubenswrapper[4810]: I1003 07:18:58.922662 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.301030 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" event={"ID":"494d64ab-9bc7-4bc6-8cc4-1085679458bb","Type":"ContainerStarted","Data":"6b16ee8566994bdd9ecd93167180c725b80ba63ff85842614f3a3fa763f0be63"} Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.302569 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319388 4810 generic.go:334] "Generic (PLEG): container finished" podID="46f115c7-7288-4791-8425-92170f792535" containerID="5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c" exitCode=0 Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319424 4810 generic.go:334] "Generic (PLEG): container finished" podID="46f115c7-7288-4791-8425-92170f792535" containerID="44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc" exitCode=2 Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319433 4810 generic.go:334] "Generic (PLEG): container finished" podID="46f115c7-7288-4791-8425-92170f792535" containerID="37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a" exitCode=0 Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319455 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerDied","Data":"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c"} Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319481 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerDied","Data":"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc"} Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.319493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerDied","Data":"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a"} Oct 03 07:18:59 crc kubenswrapper[4810]: I1003 07:18:59.338742 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" podStartSLOduration=3.338716551 podStartE2EDuration="3.338716551s" podCreationTimestamp="2025-10-03 07:18:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:18:59.330408858 +0000 UTC m=+1372.757659613" watchObservedRunningTime="2025-10-03 07:18:59.338716551 +0000 UTC m=+1372.765967286" Oct 03 07:19:00 crc kubenswrapper[4810]: I1003 07:19:00.023260 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:00 crc kubenswrapper[4810]: I1003 07:19:00.023807 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-log" 
containerID="cri-o://97e7825e9f72346aba58ffc17325579724021ef5c4b09256729c000b1df3ba80" gracePeriod=30 Oct 03 07:19:00 crc kubenswrapper[4810]: I1003 07:19:00.023880 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-api" containerID="cri-o://93efb94f140bc50a949f93e2e63517e8845aa87508f6c501792f6a74a557b800" gracePeriod=30 Oct 03 07:19:00 crc kubenswrapper[4810]: I1003 07:19:00.333675 4810 generic.go:334] "Generic (PLEG): container finished" podID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerID="97e7825e9f72346aba58ffc17325579724021ef5c4b09256729c000b1df3ba80" exitCode=143 Oct 03 07:19:00 crc kubenswrapper[4810]: I1003 07:19:00.334996 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerDied","Data":"97e7825e9f72346aba58ffc17325579724021ef5c4b09256729c000b1df3ba80"} Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.274080 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.345463 4810 generic.go:334] "Generic (PLEG): container finished" podID="46f115c7-7288-4791-8425-92170f792535" containerID="ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c" exitCode=0 Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.345545 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerDied","Data":"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c"} Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.345589 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"46f115c7-7288-4791-8425-92170f792535","Type":"ContainerDied","Data":"9da4e36ac16fb5121498f8d74a4d5cc871586e1d5b0c8181f6fe1665c4d89cdd"} Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.345611 4810 scope.go:117] "RemoveContainer" containerID="5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.345770 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.369950 4810 scope.go:117] "RemoveContainer" containerID="44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.394323 4810 scope.go:117] "RemoveContainer" containerID="ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.411811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.411883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.411918 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.411964 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.412011 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.412029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.412057 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tbrw\" (UniqueName: \"kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.412083 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd\") pod \"46f115c7-7288-4791-8425-92170f792535\" (UID: \"46f115c7-7288-4791-8425-92170f792535\") " Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.412788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "46f115c7-7288-4791-8425-92170f792535" (UID: 
"46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.414878 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.439415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw" (OuterVolumeSpecName: "kube-api-access-7tbrw") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "kube-api-access-7tbrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.439554 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts" (OuterVolumeSpecName: "scripts") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.445526 4810 scope.go:117] "RemoveContainer" containerID="37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.451719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.477811 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.498658 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.514320 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.514356 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.514368 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.514379 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.514393 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.515613 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tbrw\" (UniqueName: \"kubernetes.io/projected/46f115c7-7288-4791-8425-92170f792535-kube-api-access-7tbrw\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.515627 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/46f115c7-7288-4791-8425-92170f792535-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.522700 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data" (OuterVolumeSpecName: "config-data") pod "46f115c7-7288-4791-8425-92170f792535" (UID: "46f115c7-7288-4791-8425-92170f792535"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.586650 4810 scope.go:117] "RemoveContainer" containerID="5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.587168 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c\": container with ID starting with 5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c not found: ID does not exist" containerID="5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.587249 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c"} err="failed to get container status \"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c\": rpc error: code = NotFound desc = could not find container \"5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c\": container with ID starting with 5b00088ee6fc58934bd97ab6112e1cdd2e4184ae98b8e15a517117506314b24c not found: ID does not exist" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.587276 4810 scope.go:117] "RemoveContainer" containerID="44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.587615 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc\": container with ID starting with 44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc not found: ID does not exist" containerID="44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.587657 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc"} err="failed to get container status \"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc\": rpc error: code = NotFound desc = could not find container \"44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc\": container with ID starting with 44db82d7207f0a670090b1bc298f3783551b23d405908f3a391ebdfa87950cdc not found: ID does not exist" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.587678 4810 scope.go:117] "RemoveContainer" containerID="ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.587973 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c\": container with ID starting with ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c not found: ID does not exist" containerID="ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.588069 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c"} err="failed to get container status \"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c\": rpc error: code = NotFound desc = could not 
find container \"ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c\": container with ID starting with ad026ff327cbff53bbba8ec50847265574233ce96fa80ee8a98a582e7fdae99c not found: ID does not exist" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.588170 4810 scope.go:117] "RemoveContainer" containerID="37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.588460 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a\": container with ID starting with 37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a not found: ID does not exist" containerID="37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.588556 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a"} err="failed to get container status \"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a\": rpc error: code = NotFound desc = could not find container \"37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a\": container with ID starting with 37550e192c45c9bc4c2b7ad8e8b33a36be666bf29cc1c4af20acadce81294e3a not found: ID does not exist" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.617664 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46f115c7-7288-4791-8425-92170f792535-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.681054 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.689570 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.711531 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.712103 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-notification-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712124 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-notification-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.712154 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="sg-core" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712162 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="sg-core" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.712182 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-central-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712190 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-central-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: E1003 07:19:01.712203 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f115c7-7288-4791-8425-92170f792535" 
containerName="proxy-httpd" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712210 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="proxy-httpd" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712434 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="sg-core" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712459 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="proxy-httpd" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712479 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-central-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.712496 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f115c7-7288-4791-8425-92170f792535" containerName="ceilometer-notification-agent" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.714572 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.717577 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.717766 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.717954 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.734908 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820747 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820770 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820939 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcttm\" (UniqueName: \"kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820955 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.820994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcttm\" (UniqueName: \"kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922796 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922844 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922877 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922916 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922937 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.922977 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.923517 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.923724 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.927966 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.928095 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.928368 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.928529 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.929375 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.940103 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcttm\" (UniqueName: \"kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm\") pod \"ceilometer-0\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " pod="openstack/ceilometer-0" Oct 03 07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.987005 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 
07:19:01 crc kubenswrapper[4810]: I1003 07:19:01.989444 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.010724 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.038655 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.042166 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.126091 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.126222 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.126301 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58zk\" (UniqueName: \"kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.227597 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.228117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b58zk\" (UniqueName: \"kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.228165 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.228843 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.229144 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.249366 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b58zk\" (UniqueName: \"kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk\") pod \"redhat-operators-rspcb\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.309173 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.537502 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.544992 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:19:02 crc kubenswrapper[4810]: I1003 07:19:02.786969 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 07:19:02 crc kubenswrapper[4810]: W1003 07:19:02.787176 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3857c647_f720_4dbf_a1d8_36c014e6c66a.slice/crio-df33bcb19083eec0db6b863f814a264e6065d9fe58db432aab71400859406eaa WatchSource:0}: Error finding container df33bcb19083eec0db6b863f814a264e6065d9fe58db432aab71400859406eaa: Status 404 returned error can't find the container with id df33bcb19083eec0db6b863f814a264e6065d9fe58db432aab71400859406eaa Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.317098 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f115c7-7288-4791-8425-92170f792535" path="/var/lib/kubelet/pods/46f115c7-7288-4791-8425-92170f792535/volumes" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.380844 4810 generic.go:334] "Generic (PLEG): container finished" podID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerID="86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b" exitCode=0 Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.381065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerDied","Data":"86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b"} Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.386423 4810 generic.go:334] "Generic (PLEG): container finished" podID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerID="93efb94f140bc50a949f93e2e63517e8845aa87508f6c501792f6a74a557b800" exitCode=0 Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.390184 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerStarted","Data":"df33bcb19083eec0db6b863f814a264e6065d9fe58db432aab71400859406eaa"} Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.390278 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerDied","Data":"93efb94f140bc50a949f93e2e63517e8845aa87508f6c501792f6a74a557b800"} Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.390638 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerStarted","Data":"438fbe9683b077741038ca7b5a4fd9bd0f7b250a5c4f67b875eb91e95925fb2e"} Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.611936 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.763380 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hzwg\" (UniqueName: \"kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg\") pod \"107024a0-3564-4f12-b62d-d77b8956a5f4\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.763540 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle\") pod \"107024a0-3564-4f12-b62d-d77b8956a5f4\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.763736 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs\") pod \"107024a0-3564-4f12-b62d-d77b8956a5f4\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.763840 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data\") pod \"107024a0-3564-4f12-b62d-d77b8956a5f4\" (UID: \"107024a0-3564-4f12-b62d-d77b8956a5f4\") " Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.765788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs" (OuterVolumeSpecName: "logs") pod "107024a0-3564-4f12-b62d-d77b8956a5f4" (UID: "107024a0-3564-4f12-b62d-d77b8956a5f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.772369 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg" (OuterVolumeSpecName: "kube-api-access-9hzwg") pod "107024a0-3564-4f12-b62d-d77b8956a5f4" (UID: "107024a0-3564-4f12-b62d-d77b8956a5f4"). InnerVolumeSpecName "kube-api-access-9hzwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.815117 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data" (OuterVolumeSpecName: "config-data") pod "107024a0-3564-4f12-b62d-d77b8956a5f4" (UID: "107024a0-3564-4f12-b62d-d77b8956a5f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.822593 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107024a0-3564-4f12-b62d-d77b8956a5f4" (UID: "107024a0-3564-4f12-b62d-d77b8956a5f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.867386 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/107024a0-3564-4f12-b62d-d77b8956a5f4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.867425 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.867440 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hzwg\" (UniqueName: \"kubernetes.io/projected/107024a0-3564-4f12-b62d-d77b8956a5f4-kube-api-access-9hzwg\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.867456 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107024a0-3564-4f12-b62d-d77b8956a5f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.924792 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:19:03 crc kubenswrapper[4810]: I1003 07:19:03.950233 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.402927 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerStarted","Data":"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e"} Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.403288 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerStarted","Data":"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db"} Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.406280 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerStarted","Data":"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016"} Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.411266 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"107024a0-3564-4f12-b62d-d77b8956a5f4","Type":"ContainerDied","Data":"0b6da545f747d11a7ee3d6163497447c6381e70932dc2cbb2758d44063d75534"} Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.411304 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.411327 4810 scope.go:117] "RemoveContainer" containerID="93efb94f140bc50a949f93e2e63517e8845aa87508f6c501792f6a74a557b800" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.438212 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.459673 4810 scope.go:117] "RemoveContainer" containerID="97e7825e9f72346aba58ffc17325579724021ef5c4b09256729c000b1df3ba80" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.468031 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.481966 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.495714 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:04 crc kubenswrapper[4810]: E1003 07:19:04.496278 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-api" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.496318 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-api" Oct 03 07:19:04 crc kubenswrapper[4810]: E1003 07:19:04.496353 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-log" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.496363 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-log" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.496602 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-api" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.496628 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" containerName="nova-api-log" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.497881 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.506079 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.506305 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.506463 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.507013 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581344 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmg8\" (UniqueName: \"kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581399 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581425 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581547 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.581564 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.682878 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.683216 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data\") pod \"nova-api-0\" (UID: 
\"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.683240 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.683345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdmg8\" (UniqueName: \"kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.683370 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.683389 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.686574 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.688656 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.689434 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.690272 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.694670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.699782 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-skzrr"] Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.701281 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.705654 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.710042 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.711008 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdmg8\" (UniqueName: \"kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8\") pod \"nova-api-0\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.713552 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-skzrr"] Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.785775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.786053 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q9ng\" (UniqueName: \"kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.786203 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.786768 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.840632 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.888361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.888745 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.888845 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q9ng\" (UniqueName: \"kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.888939 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.893071 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.893576 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.893934 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:04 crc kubenswrapper[4810]: I1003 07:19:04.909116 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q9ng\" (UniqueName: \"kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng\") pod \"nova-cell1-cell-mapping-skzrr\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.073398 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.318796 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107024a0-3564-4f12-b62d-d77b8956a5f4" path="/var/lib/kubelet/pods/107024a0-3564-4f12-b62d-d77b8956a5f4/volumes" Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.321741 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.425185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerStarted","Data":"a444ad2d54accffe58904f2a656e01b1c601c30117f63e6cb1f3323da22a6f63"} Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.429308 4810 generic.go:334] "Generic (PLEG): container finished" podID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerID="51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016" exitCode=0 Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.429388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerDied","Data":"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016"} Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.438010 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerStarted","Data":"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31"} Oct 03 07:19:05 crc kubenswrapper[4810]: I1003 07:19:05.574517 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-skzrr"] Oct 03 07:19:05 crc kubenswrapper[4810]: W1003 07:19:05.579044 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda20a3e4e_60fe_468c_ab50_6a39ad5c27aa.slice/crio-73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41 WatchSource:0}: Error finding container 73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41: Status 404 returned error can't find the container with id 73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41 Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.449606 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerStarted","Data":"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.451536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-skzrr" event={"ID":"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa","Type":"ContainerStarted","Data":"c08aeca318c51ae02390085c7ab9619670b1e308ec26d5e501640e18baa3cc63"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.451578 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-skzrr" event={"ID":"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa","Type":"ContainerStarted","Data":"73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.454484 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerStarted","Data":"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.454650 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-central-agent" containerID="cri-o://43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db" gracePeriod=30 Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.454955 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.455001 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="proxy-httpd" containerID="cri-o://a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426" gracePeriod=30 Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.455053 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="sg-core" containerID="cri-o://7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31" gracePeriod=30 Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.455095 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-notification-agent" containerID="cri-o://fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e" gracePeriod=30 Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.459050 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerStarted","Data":"79b07afdab08d5da04634a5da8d989266356f202e8adddeb23629a2d41d27fe2"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.459091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerStarted","Data":"6ab668baf929d6dee8fd51ef6f06d57e0cb7cd0643fa8d940a25ff858d385069"} Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.481387 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rspcb" podStartSLOduration=2.945716499 podStartE2EDuration="5.48136793s" podCreationTimestamp="2025-10-03 07:19:01 +0000 UTC" firstStartedPulling="2025-10-03 07:19:03.383440741 +0000 UTC m=+1376.810691476" lastFinishedPulling="2025-10-03 07:19:05.919092152 +0000 UTC m=+1379.346342907" observedRunningTime="2025-10-03 07:19:06.479975443 +0000 UTC m=+1379.907226168" watchObservedRunningTime="2025-10-03 07:19:06.48136793 +0000 UTC m=+1379.908618665" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.506145 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-skzrr" podStartSLOduration=2.506116675 podStartE2EDuration="2.506116675s" podCreationTimestamp="2025-10-03 07:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:19:06.493217858 +0000 UTC m=+1379.920468593" watchObservedRunningTime="2025-10-03 07:19:06.506116675 +0000 UTC m=+1379.933367410" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 
07:19:06.531670 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.204328018 podStartE2EDuration="5.531632411s" podCreationTimestamp="2025-10-03 07:19:01 +0000 UTC" firstStartedPulling="2025-10-03 07:19:02.544764586 +0000 UTC m=+1375.972015321" lastFinishedPulling="2025-10-03 07:19:05.872068979 +0000 UTC m=+1379.299319714" observedRunningTime="2025-10-03 07:19:06.525304241 +0000 UTC m=+1379.952554976" watchObservedRunningTime="2025-10-03 07:19:06.531632411 +0000 UTC m=+1379.958883146" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.552114 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.55209143 podStartE2EDuration="2.55209143s" podCreationTimestamp="2025-10-03 07:19:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:19:06.541497826 +0000 UTC m=+1379.968748561" watchObservedRunningTime="2025-10-03 07:19:06.55209143 +0000 UTC m=+1379.979342165" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.811151 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.877238 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:19:06 crc kubenswrapper[4810]: I1003 07:19:06.877563 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="dnsmasq-dns" containerID="cri-o://3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079" gracePeriod=10 Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.433402 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.504236 4810 generic.go:334] "Generic (PLEG): container finished" podID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerID="3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079" exitCode=0 Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.504399 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.505139 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerDied","Data":"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079"} Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.505164 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" event={"ID":"39cfc321-3bad-4e79-8e28-5bd06d4f3c71","Type":"ContainerDied","Data":"b2b321fa4b8438a1ab2771db5c515d7a4c4a634ac6710dd749c812cbb1c570e3"} Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.505182 4810 scope.go:117] "RemoveContainer" containerID="3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518362 4810 generic.go:334] "Generic (PLEG): container finished" podID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerID="a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426" exitCode=0 Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518418 4810 generic.go:334] "Generic (PLEG): container finished" podID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerID="7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31" exitCode=2 Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518408 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerDied","Data":"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426"} Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518482 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerDied","Data":"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31"} Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518494 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerDied","Data":"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e"} Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.518427 4810 generic.go:334] "Generic (PLEG): container finished" podID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerID="fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e" exitCode=0 Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.535817 4810 scope.go:117] "RemoveContainer" containerID="eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561577 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561657 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5l46\" (UniqueName: \"kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561795 4810 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561836 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.561973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb\") pod \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\" (UID: \"39cfc321-3bad-4e79-8e28-5bd06d4f3c71\") " Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.565197 4810 scope.go:117] "RemoveContainer" containerID="3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079" Oct 03 07:19:07 crc kubenswrapper[4810]: E1003 07:19:07.566148 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079\": container with ID starting with 3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079 not found: ID does not exist" containerID="3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.566184 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079"} err="failed to get container status \"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079\": rpc error: code = NotFound desc = could not find container \"3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079\": container with ID starting with 3ea42cfff714797f68a50d1f6bae268376eb44afd763db2d0c1848a1d50cd079 not found: ID does not exist" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.566204 4810 scope.go:117] "RemoveContainer" containerID="eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.568024 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46" (OuterVolumeSpecName: "kube-api-access-r5l46") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "kube-api-access-r5l46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: E1003 07:19:07.570210 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448\": container with ID starting with eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448 not found: ID does not exist" containerID="eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.570264 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448"} err="failed to get container status \"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448\": rpc error: code = NotFound desc = could not find container \"eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448\": container with ID starting with eb23dbd82d7b631b452035f47c3bb8b2c7d72c8ae3c582bad9969ac4be425448 not found: ID does not exist" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.619776 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.636382 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.643662 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.652790 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.656671 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config" (OuterVolumeSpecName: "config") pod "39cfc321-3bad-4e79-8e28-5bd06d4f3c71" (UID: "39cfc321-3bad-4e79-8e28-5bd06d4f3c71"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667037 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667070 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667086 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667096 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667105 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.667115 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5l46\" (UniqueName: \"kubernetes.io/projected/39cfc321-3bad-4e79-8e28-5bd06d4f3c71-kube-api-access-r5l46\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.838057 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:19:07 crc kubenswrapper[4810]: I1003 07:19:07.850480 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57b74674bf-z2ckg"] Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.328450 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" path="/var/lib/kubelet/pods/39cfc321-3bad-4e79-8e28-5bd06d4f3c71/volumes" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.489144 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.543377 4810 generic.go:334] "Generic (PLEG): container finished" podID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerID="43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db" exitCode=0 Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.543418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerDied","Data":"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db"} Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.543443 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f641fbb2-6c4e-4355-a7fe-5f74c6a27366","Type":"ContainerDied","Data":"438fbe9683b077741038ca7b5a4fd9bd0f7b250a5c4f67b875eb91e95925fb2e"} Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.543460 4810 scope.go:117] "RemoveContainer" containerID="a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.543591 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.564247 4810 scope.go:117] "RemoveContainer" containerID="7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.582564 4810 scope.go:117] "RemoveContainer" containerID="fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.615730 4810 scope.go:117] "RemoveContainer" containerID="43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617279 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617335 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617468 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617538 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: 
\"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617628 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcttm\" (UniqueName: \"kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617695 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.617766 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts\") pod \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\" (UID: \"f641fbb2-6c4e-4355-a7fe-5f74c6a27366\") " Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.620599 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.620841 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.623457 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts" (OuterVolumeSpecName: "scripts") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.624449 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm" (OuterVolumeSpecName: "kube-api-access-jcttm") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "kube-api-access-jcttm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.648719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.674660 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.707690 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719667 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719703 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719727 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719741 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719756 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcttm\" (UniqueName: \"kubernetes.io/projected/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-kube-api-access-jcttm\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719769 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.719781 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.746732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data" (OuterVolumeSpecName: "config-data") pod "f641fbb2-6c4e-4355-a7fe-5f74c6a27366" (UID: "f641fbb2-6c4e-4355-a7fe-5f74c6a27366"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.805215 4810 scope.go:117] "RemoveContainer" containerID="a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.805722 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426\": container with ID starting with a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426 not found: ID does not exist" containerID="a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.805801 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426"} err="failed to get container status \"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426\": rpc error: code = NotFound desc = could not find container \"a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426\": container with ID starting with a8ae9997028e56b0400a97c391e28a8dad0a41ad1ad30db2ebce99d311da5426 not found: ID does not exist" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.805836 4810 scope.go:117] "RemoveContainer" containerID="7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.806184 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31\": container with ID starting with 7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31 not found: ID does not exist" containerID="7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.806309 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31"} err="failed to get container status \"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31\": rpc error: code = NotFound desc = could not find container \"7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31\": container with ID starting with 7798571667c69b6ab84cb2c95a5595188547ed9ed1940a830d56676d720a0f31 not found: ID does not exist" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.806435 4810 scope.go:117] "RemoveContainer" containerID="fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.807592 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e\": container with ID starting with fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e not found: ID does not exist" containerID="fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.807644 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e"} err="failed to get container status \"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e\": rpc error: code = NotFound desc = could not 
find container \"fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e\": container with ID starting with fff23a84147ba39c40be39592b4816da20539f525a2507973f968761e066c30e not found: ID does not exist" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.807669 4810 scope.go:117] "RemoveContainer" containerID="43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.807923 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db\": container with ID starting with 43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db not found: ID does not exist" containerID="43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.807946 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db"} err="failed to get container status \"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db\": rpc error: code = NotFound desc = could not find container \"43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db\": container with ID starting with 43fbc5a57ef12ba31dc4396abc6ea7d47a2d0d3ad840327a22c41cde72ab74db not found: ID does not exist" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.821462 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f641fbb2-6c4e-4355-a7fe-5f74c6a27366-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.885288 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.894927 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916026 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916403 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-notification-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916423 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-notification-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916543 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="proxy-httpd" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916551 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="proxy-httpd" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916567 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="init" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916573 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="init" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916581 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="dnsmasq-dns" Oct 03 07:19:09 
crc kubenswrapper[4810]: I1003 07:19:09.916587 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="dnsmasq-dns" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916599 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="sg-core" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916605 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="sg-core" Oct 03 07:19:09 crc kubenswrapper[4810]: E1003 07:19:09.916622 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-central-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916628 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-central-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916804 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="proxy-httpd" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916814 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-notification-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916826 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="dnsmasq-dns" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916845 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="sg-core" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.916855 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" containerName="ceilometer-central-agent" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.921402 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.923696 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.925013 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.925130 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 07:19:09 crc kubenswrapper[4810]: I1003 07:19:09.932819 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026051 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026129 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026150 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026254 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026287 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scmrn\" (UniqueName: \"kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.026753 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128105 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128195 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scmrn\" (UniqueName: \"kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128239 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128311 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128329 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128374 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.128391 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.130064 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.130065 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.132516 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.132706 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.133240 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.143848 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.144397 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.158314 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scmrn\" (UniqueName: \"kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn\") pod \"ceilometer-0\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " pod="openstack/ceilometer-0" Oct 03 07:19:10 crc kubenswrapper[4810]: I1003 07:19:10.258310 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:10.719114 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:11.315505 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f641fbb2-6c4e-4355-a7fe-5f74c6a27366" path="/var/lib/kubelet/pods/f641fbb2-6c4e-4355-a7fe-5f74c6a27366/volumes" Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:11.565594 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerStarted","Data":"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb"} Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:11.565948 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerStarted","Data":"354f6c76572defe4220c18e3144a43c5ea9902962319109cf6c16f3e4ff24a02"} Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:11.567305 4810 generic.go:334] "Generic (PLEG): container finished" podID="a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" containerID="c08aeca318c51ae02390085c7ab9619670b1e308ec26d5e501640e18baa3cc63" exitCode=0 Oct 03 07:19:11 crc kubenswrapper[4810]: I1003 07:19:11.567352 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-skzrr" event={"ID":"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa","Type":"ContainerDied","Data":"c08aeca318c51ae02390085c7ab9619670b1e308ec26d5e501640e18baa3cc63"} Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.310395 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.311426 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.352808 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57b74674bf-z2ckg" podUID="39cfc321-3bad-4e79-8e28-5bd06d4f3c71" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: i/o timeout" Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.358590 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.586472 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerStarted","Data":"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f"} Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.642558 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.701115 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 07:19:12 crc kubenswrapper[4810]: I1003 07:19:12.990719 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.082133 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts\") pod \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.082292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q9ng\" (UniqueName: \"kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng\") pod \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.082473 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data\") pod \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.082514 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle\") pod \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\" (UID: \"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa\") " Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.087311 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng" (OuterVolumeSpecName: "kube-api-access-5q9ng") pod "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" (UID: "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa"). InnerVolumeSpecName "kube-api-access-5q9ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.090141 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts" (OuterVolumeSpecName: "scripts") pod "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" (UID: "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.112028 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data" (OuterVolumeSpecName: "config-data") pod "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" (UID: "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.127565 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" (UID: "a20a3e4e-60fe-468c-ab50-6a39ad5c27aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.185235 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.185282 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.185299 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.185310 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q9ng\" (UniqueName: \"kubernetes.io/projected/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa-kube-api-access-5q9ng\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.596677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerStarted","Data":"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba"} Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.599452 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-skzrr" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.599860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-skzrr" event={"ID":"a20a3e4e-60fe-468c-ab50-6a39ad5c27aa","Type":"ContainerDied","Data":"73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41"} Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.599878 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73c0c3b8c58ddaafe251465a3d0126ba073cfb965833f22033eb9e1e8de00d41" Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.788281 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.788605 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-log" containerID="cri-o://6ab668baf929d6dee8fd51ef6f06d57e0cb7cd0643fa8d940a25ff858d385069" gracePeriod=30 Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.788746 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-api" containerID="cri-o://79b07afdab08d5da04634a5da8d989266356f202e8adddeb23629a2d41d27fe2" gracePeriod=30 Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.801009 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.801314 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerName="nova-scheduler-scheduler" containerID="cri-o://b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" gracePeriod=30 Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.856082 4810 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.856426 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" containerID="cri-o://9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba" gracePeriod=30 Oct 03 07:19:13 crc kubenswrapper[4810]: I1003 07:19:13.856465 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" containerID="cri-o://dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62" gracePeriod=30 Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.610016 4810 generic.go:334] "Generic (PLEG): container finished" podID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerID="9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba" exitCode=143 Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.610286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerDied","Data":"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba"} Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.614117 4810 generic.go:334] "Generic (PLEG): container finished" podID="a382b38f-5949-40fe-aebf-f9f62082471c" containerID="79b07afdab08d5da04634a5da8d989266356f202e8adddeb23629a2d41d27fe2" exitCode=0 Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.614141 4810 generic.go:334] "Generic (PLEG): container finished" podID="a382b38f-5949-40fe-aebf-f9f62082471c" containerID="6ab668baf929d6dee8fd51ef6f06d57e0cb7cd0643fa8d940a25ff858d385069" exitCode=143 Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.614173 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerDied","Data":"79b07afdab08d5da04634a5da8d989266356f202e8adddeb23629a2d41d27fe2"} Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.614193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerDied","Data":"6ab668baf929d6dee8fd51ef6f06d57e0cb7cd0643fa8d940a25ff858d385069"} Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.617261 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rspcb" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="registry-server" containerID="cri-o://fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd" gracePeriod=2 Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.621026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerStarted","Data":"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349"} Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.621925 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.648176 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.359166083 podStartE2EDuration="5.648161097s" podCreationTimestamp="2025-10-03 07:19:09 
+0000 UTC" firstStartedPulling="2025-10-03 07:19:10.730590014 +0000 UTC m=+1384.157840749" lastFinishedPulling="2025-10-03 07:19:14.019585028 +0000 UTC m=+1387.446835763" observedRunningTime="2025-10-03 07:19:14.644280783 +0000 UTC m=+1388.071531548" watchObservedRunningTime="2025-10-03 07:19:14.648161097 +0000 UTC m=+1388.075411832" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.733001 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.920671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.920786 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.920952 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdmg8\" (UniqueName: \"kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.921028 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.921091 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.921127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs\") pod \"a382b38f-5949-40fe-aebf-f9f62082471c\" (UID: \"a382b38f-5949-40fe-aebf-f9f62082471c\") " Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.922085 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs" (OuterVolumeSpecName: "logs") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.925737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8" (OuterVolumeSpecName: "kube-api-access-qdmg8") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "kube-api-access-qdmg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.963639 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data" (OuterVolumeSpecName: "config-data") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.971340 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:14 crc kubenswrapper[4810]: I1003 07:19:14.993009 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.010556 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a382b38f-5949-40fe-aebf-f9f62082471c" (UID: "a382b38f-5949-40fe-aebf-f9f62082471c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022712 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022742 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a382b38f-5949-40fe-aebf-f9f62082471c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022751 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022760 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022768 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdmg8\" (UniqueName: \"kubernetes.io/projected/a382b38f-5949-40fe-aebf-f9f62082471c-kube-api-access-qdmg8\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.022778 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a382b38f-5949-40fe-aebf-f9f62082471c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.092677 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.225324 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content\") pod \"3857c647-f720-4dbf-a1d8-36c014e6c66a\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.225479 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities\") pod \"3857c647-f720-4dbf-a1d8-36c014e6c66a\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.225635 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b58zk\" (UniqueName: \"kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk\") pod \"3857c647-f720-4dbf-a1d8-36c014e6c66a\" (UID: \"3857c647-f720-4dbf-a1d8-36c014e6c66a\") " Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.227024 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities" (OuterVolumeSpecName: "utilities") pod "3857c647-f720-4dbf-a1d8-36c014e6c66a" (UID: "3857c647-f720-4dbf-a1d8-36c014e6c66a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.230649 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk" (OuterVolumeSpecName: "kube-api-access-b58zk") pod "3857c647-f720-4dbf-a1d8-36c014e6c66a" (UID: "3857c647-f720-4dbf-a1d8-36c014e6c66a"). InnerVolumeSpecName "kube-api-access-b58zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.326938 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3857c647-f720-4dbf-a1d8-36c014e6c66a" (UID: "3857c647-f720-4dbf-a1d8-36c014e6c66a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.328501 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.328542 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3857c647-f720-4dbf-a1d8-36c014e6c66a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.328555 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b58zk\" (UniqueName: \"kubernetes.io/projected/3857c647-f720-4dbf-a1d8-36c014e6c66a-kube-api-access-b58zk\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.496378 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.497795 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.498968 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.499055 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerName="nova-scheduler-scheduler" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.628802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a382b38f-5949-40fe-aebf-f9f62082471c","Type":"ContainerDied","Data":"a444ad2d54accffe58904f2a656e01b1c601c30117f63e6cb1f3323da22a6f63"} Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.629829 4810 scope.go:117] "RemoveContainer" containerID="79b07afdab08d5da04634a5da8d989266356f202e8adddeb23629a2d41d27fe2" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.628822 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.631592 4810 generic.go:334] "Generic (PLEG): container finished" podID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerID="fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd" exitCode=0 Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.631662 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerDied","Data":"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd"} Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.631690 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rspcb" event={"ID":"3857c647-f720-4dbf-a1d8-36c014e6c66a","Type":"ContainerDied","Data":"df33bcb19083eec0db6b863f814a264e6065d9fe58db432aab71400859406eaa"} Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.631789 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rspcb" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.656597 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.666340 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.673794 4810 scope.go:117] "RemoveContainer" containerID="6ab668baf929d6dee8fd51ef6f06d57e0cb7cd0643fa8d940a25ff858d385069" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.687601 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.703588 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rspcb"] Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.713240 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.713793 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="extract-utilities" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714024 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="extract-utilities" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.714113 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-log" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714176 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-log" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.714227 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="registry-server" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714273 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="registry-server" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.714333 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="extract-content" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714381 
4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="extract-content" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714154 4810 scope.go:117] "RemoveContainer" containerID="fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.714431 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" containerName="nova-manage" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" containerName="nova-manage" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.714578 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-api" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714623 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-api" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.714919 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-log" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.715019 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" containerName="registry-server" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.715117 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" containerName="nova-api-api" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.715187 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" containerName="nova-manage" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.716509 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.718360 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.719882 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.720492 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.725426 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.744220 4810 scope.go:117] "RemoveContainer" containerID="51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.764641 4810 scope.go:117] "RemoveContainer" containerID="86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.795483 4810 scope.go:117] "RemoveContainer" containerID="fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.798347 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd\": container with ID starting with fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd not found: ID does not exist" containerID="fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.798392 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd"} err="failed to get container status \"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd\": rpc error: code = NotFound desc = could not find container \"fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd\": container with ID starting with fc2212649c3cf3e8dda85f62e844db073fe7bf0f814afd5cbd59b066b09fdacd not found: ID does not exist" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.798419 4810 scope.go:117] "RemoveContainer" containerID="51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.798784 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016\": container with ID starting with 51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016 not found: ID does not exist" containerID="51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.798815 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016"} err="failed to get container status \"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016\": rpc error: code = NotFound desc = could not find container \"51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016\": container with ID starting with 51c7e46cdff27c4f4f3b815715176641f361807df2b81c00027364c765360016 not found: ID does not exist" Oct 03 07:19:15 crc 
kubenswrapper[4810]: I1003 07:19:15.798839 4810 scope.go:117] "RemoveContainer" containerID="86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b" Oct 03 07:19:15 crc kubenswrapper[4810]: E1003 07:19:15.799103 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b\": container with ID starting with 86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b not found: ID does not exist" containerID="86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.799126 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b"} err="failed to get container status \"86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b\": rpc error: code = NotFound desc = could not find container \"86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b\": container with ID starting with 86508c4e1190951e7c5074f0554920503bb1c9d9f85953ca4adb3ecf10d6516b not found: ID does not exist" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836336 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836418 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836476 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836504 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlpv6\" (UniqueName: \"kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836567 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.836597 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939018 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939142 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939204 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939292 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.939505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlpv6\" (UniqueName: \"kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.942108 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.945728 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.946483 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.947309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.947493 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:15 crc kubenswrapper[4810]: I1003 07:19:15.974400 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlpv6\" (UniqueName: \"kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6\") pod \"nova-api-0\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " pod="openstack/nova-api-0" Oct 03 07:19:16 crc kubenswrapper[4810]: I1003 07:19:16.038417 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:19:16 crc kubenswrapper[4810]: W1003 07:19:16.490010 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b7c316c_a9f6_4fa2_b150_83d6b65a9d25.slice/crio-8d373eced8c707de1a18288fed92bfc19b35ff8b730a2b74971d53b68a3d16be WatchSource:0}: Error finding container 8d373eced8c707de1a18288fed92bfc19b35ff8b730a2b74971d53b68a3d16be: Status 404 returned error can't find the container with id 8d373eced8c707de1a18288fed92bfc19b35ff8b730a2b74971d53b68a3d16be Oct 03 07:19:16 crc kubenswrapper[4810]: I1003 07:19:16.490638 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:19:16 crc kubenswrapper[4810]: I1003 07:19:16.645650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerStarted","Data":"8d373eced8c707de1a18288fed92bfc19b35ff8b730a2b74971d53b68a3d16be"} Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.005225 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:58414->10.217.0.193:8775: read: connection reset by peer" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.005290 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:58412->10.217.0.193:8775: read: connection reset by peer" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.325659 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3857c647-f720-4dbf-a1d8-36c014e6c66a" path="/var/lib/kubelet/pods/3857c647-f720-4dbf-a1d8-36c014e6c66a/volumes" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.326867 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a382b38f-5949-40fe-aebf-f9f62082471c" path="/var/lib/kubelet/pods/a382b38f-5949-40fe-aebf-f9f62082471c/volumes" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.450145 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.575771 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle\") pod \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.575853 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs\") pod \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.576067 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data\") pod \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.576128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htfz\" (UniqueName: \"kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz\") pod \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.576169 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs\") pod \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\" (UID: \"e94cb765-8d5d-4f1d-9354-93b10d89c2b5\") " Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.576265 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs" (OuterVolumeSpecName: "logs") pod "e94cb765-8d5d-4f1d-9354-93b10d89c2b5" (UID: "e94cb765-8d5d-4f1d-9354-93b10d89c2b5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.576524 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.581208 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz" (OuterVolumeSpecName: "kube-api-access-5htfz") pod "e94cb765-8d5d-4f1d-9354-93b10d89c2b5" (UID: "e94cb765-8d5d-4f1d-9354-93b10d89c2b5"). InnerVolumeSpecName "kube-api-access-5htfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.604626 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e94cb765-8d5d-4f1d-9354-93b10d89c2b5" (UID: "e94cb765-8d5d-4f1d-9354-93b10d89c2b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.624306 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data" (OuterVolumeSpecName: "config-data") pod "e94cb765-8d5d-4f1d-9354-93b10d89c2b5" (UID: "e94cb765-8d5d-4f1d-9354-93b10d89c2b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.630219 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e94cb765-8d5d-4f1d-9354-93b10d89c2b5" (UID: "e94cb765-8d5d-4f1d-9354-93b10d89c2b5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.661756 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerStarted","Data":"04a5b24413e8a224bab7a8f72e4facc8f0a0f5ecaf88b24ac5a41f3d8c47c173"} Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.661802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerStarted","Data":"d2effcc3ba0d8b3fd037a8a2e4fb2fa5cfd1c2d632b8be6e535aa804e226e280"} Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.665020 4810 generic.go:334] "Generic (PLEG): container finished" podID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerID="dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62" exitCode=0 Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.665055 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerDied","Data":"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62"} Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.665077 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e94cb765-8d5d-4f1d-9354-93b10d89c2b5","Type":"ContainerDied","Data":"7c3665e1744316b45a294248855a8848c46a7d9b62c0d8d10908c28886d936bf"} Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.665097 4810 scope.go:117] "RemoveContainer" containerID="dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.665148 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.678049 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.678085 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5htfz\" (UniqueName: \"kubernetes.io/projected/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-kube-api-access-5htfz\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.678098 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.678109 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e94cb765-8d5d-4f1d-9354-93b10d89c2b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.687961 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.687866502 podStartE2EDuration="2.687866502s" podCreationTimestamp="2025-10-03 07:19:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:19:17.680072782 +0000 UTC m=+1391.107323527" watchObservedRunningTime="2025-10-03 07:19:17.687866502 +0000 UTC m=+1391.115117237" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.706423 4810 scope.go:117] "RemoveContainer" containerID="9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.709834 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.723142 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.732590 4810 scope.go:117] "RemoveContainer" containerID="dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62" Oct 03 07:19:17 crc kubenswrapper[4810]: E1003 07:19:17.733098 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62\": container with ID starting with dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62 not found: ID does not exist" containerID="dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.733141 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62"} err="failed to get container status \"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62\": rpc error: code = NotFound desc = could not find container \"dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62\": container with ID starting with dd9d86077645acb28ce9d65b21ee552593de683d48d9a7657035410ffc377b62 not found: ID does not exist" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.733168 4810 scope.go:117] "RemoveContainer" 
containerID="9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba" Oct 03 07:19:17 crc kubenswrapper[4810]: E1003 07:19:17.733430 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba\": container with ID starting with 9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba not found: ID does not exist" containerID="9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.733455 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba"} err="failed to get container status \"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba\": rpc error: code = NotFound desc = could not find container \"9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba\": container with ID starting with 9b2ed6383f8b73cf903aa5b3fd8a9bb2ad59dd6c3b92b417ce3f071d9b708cba not found: ID does not exist" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.735386 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:17 crc kubenswrapper[4810]: E1003 07:19:17.735827 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.735849 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" Oct 03 07:19:17 crc kubenswrapper[4810]: E1003 07:19:17.735883 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.735909 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.742982 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-metadata" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.743017 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" containerName="nova-metadata-log" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.745091 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.766680 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.770492 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.771041 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.886978 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.887059 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.887125 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.887259 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffvrh\" (UniqueName: \"kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.887320 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994180 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffvrh\" (UniqueName: \"kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994234 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994296 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994322 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994384 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.994917 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.998768 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:17 crc kubenswrapper[4810]: I1003 07:19:17.999046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:18 crc kubenswrapper[4810]: I1003 07:19:18.002506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:18 crc kubenswrapper[4810]: I1003 07:19:18.016856 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffvrh\" (UniqueName: \"kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh\") pod \"nova-metadata-0\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " pod="openstack/nova-metadata-0" Oct 03 07:19:18 crc kubenswrapper[4810]: I1003 07:19:18.094285 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:19:18 crc kubenswrapper[4810]: I1003 07:19:18.534564 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:19:18 crc kubenswrapper[4810]: W1003 07:19:18.543038 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c746de5_0009_4522_8e8b_f7d7e0ca5fe7.slice/crio-1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082 WatchSource:0}: Error finding container 1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082: Status 404 returned error can't find the container with id 1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082 Oct 03 07:19:18 crc kubenswrapper[4810]: I1003 07:19:18.678212 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerStarted","Data":"1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082"} Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.316330 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e94cb765-8d5d-4f1d-9354-93b10d89c2b5" path="/var/lib/kubelet/pods/e94cb765-8d5d-4f1d-9354-93b10d89c2b5/volumes" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.591151 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.631966 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle\") pod \"873ef77f-c12d-4c61-bf40-7b49e6988629\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.632088 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s6bv\" (UniqueName: \"kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv\") pod \"873ef77f-c12d-4c61-bf40-7b49e6988629\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.632397 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data\") pod \"873ef77f-c12d-4c61-bf40-7b49e6988629\" (UID: \"873ef77f-c12d-4c61-bf40-7b49e6988629\") " Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.666379 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv" (OuterVolumeSpecName: "kube-api-access-5s6bv") pod "873ef77f-c12d-4c61-bf40-7b49e6988629" (UID: "873ef77f-c12d-4c61-bf40-7b49e6988629"). InnerVolumeSpecName "kube-api-access-5s6bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.672378 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "873ef77f-c12d-4c61-bf40-7b49e6988629" (UID: "873ef77f-c12d-4c61-bf40-7b49e6988629"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.674578 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data" (OuterVolumeSpecName: "config-data") pod "873ef77f-c12d-4c61-bf40-7b49e6988629" (UID: "873ef77f-c12d-4c61-bf40-7b49e6988629"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.699763 4810 generic.go:334] "Generic (PLEG): container finished" podID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" exitCode=0 Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.699816 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.699845 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"873ef77f-c12d-4c61-bf40-7b49e6988629","Type":"ContainerDied","Data":"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab"} Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.699880 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"873ef77f-c12d-4c61-bf40-7b49e6988629","Type":"ContainerDied","Data":"e2ace4166333370775b244b2af4191d6f73c93f622d8531f278a2cfd489fe5dc"} Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.699921 4810 scope.go:117] "RemoveContainer" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.704447 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerStarted","Data":"95b505fefa5eb2055f79509fe6a51262d5e6c2ddec8bdf8c70e3b8cacb9557c6"} Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.704484 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerStarted","Data":"3f1514c03f3de86269cf910c6642b971ca929d24ed8843f73f5cd456ca222515"} Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.734833 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.734869 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/873ef77f-c12d-4c61-bf40-7b49e6988629-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.734883 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s6bv\" (UniqueName: \"kubernetes.io/projected/873ef77f-c12d-4c61-bf40-7b49e6988629-kube-api-access-5s6bv\") on node \"crc\" DevicePath \"\"" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.738583 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.738559632 podStartE2EDuration="2.738559632s" podCreationTimestamp="2025-10-03 07:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:19:19.723648632 
+0000 UTC m=+1393.150899367" watchObservedRunningTime="2025-10-03 07:19:19.738559632 +0000 UTC m=+1393.165810377" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.739836 4810 scope.go:117] "RemoveContainer" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" Oct 03 07:19:19 crc kubenswrapper[4810]: E1003 07:19:19.740252 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab\": container with ID starting with b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab not found: ID does not exist" containerID="b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.740275 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab"} err="failed to get container status \"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab\": rpc error: code = NotFound desc = could not find container \"b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab\": container with ID starting with b8b3f09313c461206729cbbc4a43353ac7d2995ef5e79dd535fbd0dda190b8ab not found: ID does not exist" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.772552 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.782763 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.796828 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:19 crc kubenswrapper[4810]: E1003 07:19:19.797351 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerName="nova-scheduler-scheduler" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.797377 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerName="nova-scheduler-scheduler" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.797593 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" containerName="nova-scheduler-scheduler" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.798412 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.801514 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.817249 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.937405 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.937489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldm4l\" (UniqueName: \"kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:19 crc kubenswrapper[4810]: I1003 07:19:19.937519 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.042701 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldm4l\" (UniqueName: \"kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.042995 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.043257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.048181 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.048640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.058431 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldm4l\" (UniqueName: 
\"kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l\") pod \"nova-scheduler-0\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.159763 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.642986 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:19:20 crc kubenswrapper[4810]: W1003 07:19:20.654232 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42b07401_8647_4f71_a3a8_ef7e2e72e171.slice/crio-04192173094e28b8ce7c54a6f854f1c13abca1be6c22df10deaeb19c99482b5e WatchSource:0}: Error finding container 04192173094e28b8ce7c54a6f854f1c13abca1be6c22df10deaeb19c99482b5e: Status 404 returned error can't find the container with id 04192173094e28b8ce7c54a6f854f1c13abca1be6c22df10deaeb19c99482b5e Oct 03 07:19:20 crc kubenswrapper[4810]: I1003 07:19:20.741601 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42b07401-8647-4f71-a3a8-ef7e2e72e171","Type":"ContainerStarted","Data":"04192173094e28b8ce7c54a6f854f1c13abca1be6c22df10deaeb19c99482b5e"} Oct 03 07:19:21 crc kubenswrapper[4810]: I1003 07:19:21.319920 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873ef77f-c12d-4c61-bf40-7b49e6988629" path="/var/lib/kubelet/pods/873ef77f-c12d-4c61-bf40-7b49e6988629/volumes" Oct 03 07:19:21 crc kubenswrapper[4810]: I1003 07:19:21.756635 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42b07401-8647-4f71-a3a8-ef7e2e72e171","Type":"ContainerStarted","Data":"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530"} Oct 03 07:19:21 crc kubenswrapper[4810]: I1003 07:19:21.787944 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.787880387 podStartE2EDuration="2.787880387s" podCreationTimestamp="2025-10-03 07:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:19:21.772982036 +0000 UTC m=+1395.200232791" watchObservedRunningTime="2025-10-03 07:19:21.787880387 +0000 UTC m=+1395.215131142" Oct 03 07:19:23 crc kubenswrapper[4810]: I1003 07:19:23.095255 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 07:19:23 crc kubenswrapper[4810]: I1003 07:19:23.095372 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 07:19:25 crc kubenswrapper[4810]: I1003 07:19:25.160322 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 07:19:26 crc kubenswrapper[4810]: I1003 07:19:26.038619 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:19:26 crc kubenswrapper[4810]: I1003 07:19:26.039191 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 07:19:27 crc kubenswrapper[4810]: I1003 07:19:27.051284 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 07:19:27 crc kubenswrapper[4810]: I1003 07:19:27.051298 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 07:19:28 crc kubenswrapper[4810]: I1003 07:19:28.095039 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 07:19:28 crc kubenswrapper[4810]: I1003 07:19:28.095571 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 07:19:29 crc kubenswrapper[4810]: I1003 07:19:29.112118 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 07:19:29 crc kubenswrapper[4810]: I1003 07:19:29.112128 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 07:19:30 crc kubenswrapper[4810]: I1003 07:19:30.160124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 07:19:30 crc kubenswrapper[4810]: I1003 07:19:30.200381 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 07:19:30 crc kubenswrapper[4810]: I1003 07:19:30.882113 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.049121 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.050568 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.052704 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.059376 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.911301 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 07:19:36 crc kubenswrapper[4810]: I1003 07:19:36.924525 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 07:19:38 crc kubenswrapper[4810]: I1003 07:19:38.102017 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 07:19:38 crc kubenswrapper[4810]: I1003 07:19:38.104128 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 07:19:38 crc kubenswrapper[4810]: I1003 07:19:38.107521 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Oct 03 07:19:38 crc kubenswrapper[4810]: I1003 07:19:38.947797 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 07:19:40 crc kubenswrapper[4810]: I1003 07:19:40.272974 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.090569 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.091471 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.530698 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.531232 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" containerName="openstackclient" containerID="cri-o://b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf" gracePeriod=2 Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.555572 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.566417 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.634005 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.634394 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-hdwhb" podUID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" containerName="openstack-network-exporter" containerID="cri-o://cf1efc6c98e7db501c387b01da937aadb58092bae8f0df2fc0d91ec5c743ef15" gracePeriod=30 Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.664234 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.722096 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.722382 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" containerID="cri-o://ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" gracePeriod=30 Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.722534 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="openstack-network-exporter" containerID="cri-o://1fc2cb239867d8112861fe2339c64ebf43664be1f37eb5d32adf5e8d9a6244a0" gracePeriod=30 Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.803498 4810 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.839288 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:02 crc kubenswrapper[4810]: E1003 07:20:02.839718 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" containerName="openstackclient" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.839730 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" containerName="openstackclient" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.839932 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" containerName="openstackclient" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.840549 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.866810 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.868072 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.887032 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.911980 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.976717 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.978001 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.998975 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgs2q\" (UniqueName: \"kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q\") pod \"cinder8097-account-delete-nn9p8\" (UID: \"28fa9089-99f4-4d50-8fe8-5f4492fe3767\") " pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:02 crc kubenswrapper[4810]: I1003 07:20:02.999130 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wg8s\" (UniqueName: \"kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s\") pod \"barbican0121-account-delete-lrqrh\" (UID: \"20859507-5b44-49b8-96f0-501d89f511c9\") " pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.001114 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.001159 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data podName:94a52a94-56b0-4dc9-9804-020a890b1fff nodeName:}" failed. No retries permitted until 2025-10-03 07:20:03.501145872 +0000 UTC m=+1436.928396607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data") pod "rabbitmq-server-0" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff") : configmap "rabbitmq-config-data" not found Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.036307 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-bpqps"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.077313 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-bpqps"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.101326 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgs2q\" (UniqueName: \"kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q\") pod \"cinder8097-account-delete-nn9p8\" (UID: \"28fa9089-99f4-4d50-8fe8-5f4492fe3767\") " pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.101418 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxl6\" (UniqueName: \"kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6\") pod \"placementc289-account-delete-v9bc4\" (UID: \"ebbf1a25-79f6-42f3-af72-ff0e33094ecd\") " pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.101482 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wg8s\" (UniqueName: \"kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s\") pod \"barbican0121-account-delete-lrqrh\" (UID: \"20859507-5b44-49b8-96f0-501d89f511c9\") " pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.139276 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dth2s"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.179095 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgs2q\" (UniqueName: \"kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q\") pod \"cinder8097-account-delete-nn9p8\" (UID: \"28fa9089-99f4-4d50-8fe8-5f4492fe3767\") " pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.204279 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wg8s\" (UniqueName: \"kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s\") pod \"barbican0121-account-delete-lrqrh\" (UID: \"20859507-5b44-49b8-96f0-501d89f511c9\") " pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.223793 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dth2s"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.268184 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxl6\" (UniqueName: \"kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6\") pod \"placementc289-account-delete-v9bc4\" (UID: \"ebbf1a25-79f6-42f3-af72-ff0e33094ecd\") " pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.380607 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-hdwhb_dd1f72b6-eab9-4f83-b5ed-5507c2e79a46/openstack-network-exporter/0.log" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.380888 4810 generic.go:334] "Generic (PLEG): container finished" podID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" containerID="cf1efc6c98e7db501c387b01da937aadb58092bae8f0df2fc0d91ec5c743ef15" exitCode=2 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.404501 4810 generic.go:334] "Generic (PLEG): container finished" podID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerID="1fc2cb239867d8112861fe2339c64ebf43664be1f37eb5d32adf5e8d9a6244a0" exitCode=2 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.441624 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxl6\" (UniqueName: \"kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6\") pod \"placementc289-account-delete-v9bc4\" (UID: \"ebbf1a25-79f6-42f3-af72-ff0e33094ecd\") " pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.505190 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.505259 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data podName:94a52a94-56b0-4dc9-9804-020a890b1fff nodeName:}" failed. No retries permitted until 2025-10-03 07:20:04.505239617 +0000 UTC m=+1437.932490352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data") pod "rabbitmq-server-0" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff") : configmap "rabbitmq-config-data" not found Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.678026 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af26805-7114-468f-90a0-20e603b0d9a3" path="/var/lib/kubelet/pods/4af26805-7114-468f-90a0-20e603b0d9a3/volumes" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.693798 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51bd34d5-28c5-4d82-960c-79061d4d221a" path="/var/lib/kubelet/pods/51bd34d5-28c5-4d82-960c-79061d4d221a/volumes" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.699362 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-67bvc"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.699407 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdwhb" event={"ID":"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46","Type":"ContainerDied","Data":"cf1efc6c98e7db501c387b01da937aadb58092bae8f0df2fc0d91ec5c743ef15"} Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.699441 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-67bvc"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.699471 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerDied","Data":"1fc2cb239867d8112861fe2339c64ebf43664be1f37eb5d32adf5e8d9a6244a0"} Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.699490 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.704744 4810 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.704780 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.704795 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.705596 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.705709 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.705737 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.733114 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.734183 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="openstack-network-exporter" containerID="cri-o://125c3a25a7189d260fedad09c0d614352a064e4e81c52065137ac3f845134c30" gracePeriod=300 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.748254 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.786002 4810 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-7rltw" message="Exiting ovn-controller (1) " Oct 03 07:20:03 crc kubenswrapper[4810]: E1003 07:20:03.786042 4810 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-7rltw" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" containerID="cri-o://ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.786075 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-7rltw" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" containerID="cri-o://ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" gracePeriod=29 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.800317 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.818729 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.824513 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="openstack-network-exporter" containerID="cri-o://e15533b44b899ab5c22bbd516df9bac595774dd7440a2dc50fd69ae2fb595cdd" gracePeriod=300 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.836718 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.838524 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.840164 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.852127 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.858377 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="ovsdbserver-sb" containerID="cri-o://00634c712dc0748b4e08ff0a87638caa3ab48eec323f9b5a7efe244d9c29c7f0" gracePeriod=300 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.867510 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mn9pm"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.878495 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mn9pm"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.888101 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-skzrr"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.889155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdwh\" (UniqueName: \"kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh\") pod \"novaapi2bc5-account-delete-8p8mv\" (UID: \"5c723fc6-aae6-4032-afba-622dd5ea4dbc\") " pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.889323 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6d89\" (UniqueName: \"kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89\") pod \"novacell03325-account-delete-sqg6n\" (UID: \"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa\") " pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.902641 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-skzrr"] Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.915634 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="ovsdbserver-nb" containerID="cri-o://6b1b4cc1d1b966fa22564b66bb53d584c8fb69081697fbded46dedc35eaf64df" gracePeriod=300 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.928652 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:20:03 crc 
kubenswrapper[4810]: I1003 07:20:03.929056 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="cinder-scheduler" containerID="cri-o://0247a4056b29a103d07032ed7a3c453d738bdaa7868ee5953410287fdcac3220" gracePeriod=30 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.929745 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="probe" containerID="cri-o://3216b2c3caf454d8a6c7df9a162014f7925ab9d4e0f9fb8999f1009a4df3da6f" gracePeriod=30 Oct 03 07:20:03 crc kubenswrapper[4810]: I1003 07:20:03.983032 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.994440 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdwh\" (UniqueName: \"kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh\") pod \"novaapi2bc5-account-delete-8p8mv\" (UID: \"5c723fc6-aae6-4032-afba-622dd5ea4dbc\") " pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.994643 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8r4x\" (UniqueName: \"kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x\") pod \"neutron073c-account-delete-9hj8g\" (UID: \"b493c984-b8a7-4a78-9210-fc44ecf70eec\") " pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.994677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6d89\" (UniqueName: \"kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89\") pod \"novacell03325-account-delete-sqg6n\" (UID: \"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa\") " pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.998704 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.998969 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75d48cc7d6-2knhz" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-log" containerID="cri-o://db3e2a840110e13b015cb2f3ec8f381cda6865452f991a0abd017a3f0c638437" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:03.999377 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-75d48cc7d6-2knhz" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-api" containerID="cri-o://1f17eff80bda95e4a3a828da788b685a51858f514dff3cf24b3bd09f553d4640" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.027244 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6d89\" (UniqueName: \"kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89\") pod \"novacell03325-account-delete-sqg6n\" (UID: \"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa\") " pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.028529 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdwh\" (UniqueName: 
\"kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh\") pod \"novaapi2bc5-account-delete-8p8mv\" (UID: \"5c723fc6-aae6-4032-afba-622dd5ea4dbc\") " pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.065669 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ds94g"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.079469 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ds94g"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.102743 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8r4x\" (UniqueName: \"kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x\") pod \"neutron073c-account-delete-9hj8g\" (UID: \"b493c984-b8a7-4a78-9210-fc44ecf70eec\") " pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.104584 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.104629 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data podName:37cd32da-b730-4e57-a2e0-41bf95ff8ca1 nodeName:}" failed. No retries permitted until 2025-10-03 07:20:04.604613881 +0000 UTC m=+1438.031864606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data") pod "rabbitmq-cell1-server-0" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1") : configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.117931 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xk8xp"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.130110 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.130398 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="dnsmasq-dns" containerID="cri-o://6b16ee8566994bdd9ecd93167180c725b80ba63ff85842614f3a3fa763f0be63" gracePeriod=10 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.138184 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8r4x\" (UniqueName: \"kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x\") pod \"neutron073c-account-delete-9hj8g\" (UID: \"b493c984-b8a7-4a78-9210-fc44ecf70eec\") " pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.157993 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xk8xp"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.170198 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.170418 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api-log" containerID="cri-o://bb77897c0ed3781276a2a824ff0d9bfd38c627747b91d61f75b3403cb0b2103a" gracePeriod=30 Oct 03 07:20:04 crc 
kubenswrapper[4810]: I1003 07:20:04.170750 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api" containerID="cri-o://61317c73aecc47dae6a6843a4447305cfab5916eb86c694594f67fbaea6a141f" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.200061 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-c27mw"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.231052 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-c27mw"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.250772 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251244 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-updater" containerID="cri-o://46fe542f82adb9dd0f09edd022b0f4c932309d5e5f689981a10664ea462e5666" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251276 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-expirer" containerID="cri-o://895aa7e546ccbeb235b2c62051d1f3cb4581f5dca2fdbb37fa958319aba74571" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251443 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-updater" containerID="cri-o://888e7fe71210eba8464268fac89d6874d77e318e83fa8a722297b8a5dbe5627c" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251489 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-auditor" containerID="cri-o://ac5b608d8a7aa92bd915544af8c042f3b63ff7151a33ff4dbb17a4a861e889b3" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251640 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-replicator" containerID="cri-o://2fea3519604050a7d09779c938e0622ac0f41e5a7bf1dd301b0b7a60c049d624" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251744 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="swift-recon-cron" containerID="cri-o://1dc52a1d91d58d42a133fd8634f92be4368fdda5c3e244244cb985f1a0e500b4" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251805 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="rsync" containerID="cri-o://889f53c414198ba8568ddac1401c1afddd20f9d138e22cef33957e8a0c27a046" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251861 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-server" 
containerID="cri-o://ccf54b0b656baac5b5fa788f8029adbf50760366a62fbd65bb43a3456b3066a8" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251869 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-replicator" containerID="cri-o://45e3ddc52af834f603045aa999d0eb049c646473758731b30f7e8e2af55782fa" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251924 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-auditor" containerID="cri-o://260956333a59a33c2b6828c4e5152fe36a01a3f751272837729f275e4afc7b35" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.251969 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-server" containerID="cri-o://2b0ce3947967fa96d846a63bd3163c855b442489cfeea6683ecb041c0f9cf8f1" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.252011 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-reaper" containerID="cri-o://9974961e6abf7febdcf8e34a0c97b87742b29d09441f5766a4da92668b493130" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.252215 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-auditor" containerID="cri-o://0f9c3077ad8cecdd534fca11fbffb1bf485c180ae8d7129b25caffd483249975" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.252282 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-replicator" containerID="cri-o://a6d2ec4008f34f574e9c7b7b8a4964291b10701cd744120f6b95298162e1741c" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.252179 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-server" containerID="cri-o://bccbd5c0aebf7a8d7a2c209dbc4d28096af4feec63ee3acb6fb99edbed1f7128" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.386950 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.389246 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" containerID="cri-o://95b505fefa5eb2055f79509fe6a51262d5e6c2ddec8bdf8c70e3b8cacb9557c6" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.389853 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" containerID="cri-o://3f1514c03f3de86269cf910c6642b971ca929d24ed8843f73f5cd456ca222515" gracePeriod=30 Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.394778 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.403923 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.424198 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:04 crc kubenswrapper[4810]: E1003 07:20:04.424279 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" Oct 03 07:20:04 crc kubenswrapper[4810]: I1003 07:20:04.458444 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-94grw"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.576943 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-94grw"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.608152 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.614434 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16b4d1d-9bc5-4634-a03e-fd823db95e0d/ovsdbserver-nb/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.614468 4810 generic.go:334] "Generic (PLEG): container finished" podID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerID="e15533b44b899ab5c22bbd516df9bac595774dd7440a2dc50fd69ae2fb595cdd" exitCode=2 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.614485 4810 generic.go:334] "Generic (PLEG): container finished" podID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerID="6b1b4cc1d1b966fa22564b66bb53d584c8fb69081697fbded46dedc35eaf64df" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.614534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerDied","Data":"e15533b44b899ab5c22bbd516df9bac595774dd7440a2dc50fd69ae2fb595cdd"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.614557 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerDied","Data":"6b1b4cc1d1b966fa22564b66bb53d584c8fb69081697fbded46dedc35eaf64df"} Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.635045 4810 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 07:20:05 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname 
/usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 07:20:05 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNBridge=br-int Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNRemote=tcp:localhost:6642 Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNEncapType=geneve Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNAvailabilityZones= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ EnableChassisAsGateway=true Oct 03 07:20:05 crc kubenswrapper[4810]: ++ PhysicalNetworks= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNHostName= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 07:20:05 crc kubenswrapper[4810]: ++ ovs_dir=/var/lib/openvswitch Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 07:20:05 crc kubenswrapper[4810]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + cleanup_ovsdb_server_semaphore Oct 03 07:20:05 crc kubenswrapper[4810]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 07:20:05 crc kubenswrapper[4810]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-s8f2s" message=< Oct 03 07:20:05 crc kubenswrapper[4810]: Exiting ovsdb-server (5) [ OK ] Oct 03 07:20:05 crc kubenswrapper[4810]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 07:20:05 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNBridge=br-int Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNRemote=tcp:localhost:6642 Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNEncapType=geneve Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNAvailabilityZones= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ EnableChassisAsGateway=true Oct 03 07:20:05 crc kubenswrapper[4810]: ++ PhysicalNetworks= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNHostName= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 07:20:05 crc kubenswrapper[4810]: ++ ovs_dir=/var/lib/openvswitch Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 07:20:05 crc kubenswrapper[4810]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + cleanup_ovsdb_server_semaphore Oct 03 07:20:05 crc kubenswrapper[4810]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 07:20:05 crc kubenswrapper[4810]: > Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.635098 4810 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 07:20:05 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 03 07:20:05 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNBridge=br-int Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNRemote=tcp:localhost:6642 Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNEncapType=geneve Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNAvailabilityZones= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ EnableChassisAsGateway=true Oct 03 07:20:05 crc kubenswrapper[4810]: ++ PhysicalNetworks= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ OVNHostName= Oct 03 07:20:05 crc kubenswrapper[4810]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 03 07:20:05 crc kubenswrapper[4810]: ++ ovs_dir=/var/lib/openvswitch Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 03 07:20:05 crc kubenswrapper[4810]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 03 07:20:05 crc kubenswrapper[4810]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + sleep 0.5 Oct 03 07:20:05 crc kubenswrapper[4810]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 03 07:20:05 crc kubenswrapper[4810]: + cleanup_ovsdb_server_semaphore Oct 03 07:20:05 crc kubenswrapper[4810]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 03 07:20:05 crc kubenswrapper[4810]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 03 07:20:05 crc kubenswrapper[4810]: > pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" containerID="cri-o://f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.635142 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" containerID="cri-o://f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" gracePeriod=28 Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.637724 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.637969 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data podName:37cd32da-b730-4e57-a2e0-41bf95ff8ca1 nodeName:}" failed. No retries permitted until 2025-10-03 07:20:05.637950162 +0000 UTC m=+1439.065200917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data") pod "rabbitmq-cell1-server-0" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1") : configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.639190 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.639227 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data podName:94a52a94-56b0-4dc9-9804-020a890b1fff nodeName:}" failed. No retries permitted until 2025-10-03 07:20:06.639218576 +0000 UTC m=+1440.066469311 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data") pod "rabbitmq-server-0" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff") : configmap "rabbitmq-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.697106 4810 generic.go:334] "Generic (PLEG): container finished" podID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerID="6b16ee8566994bdd9ecd93167180c725b80ba63ff85842614f3a3fa763f0be63" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.700451 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" event={"ID":"494d64ab-9bc7-4bc6-8cc4-1085679458bb","Type":"ContainerDied","Data":"6b16ee8566994bdd9ecd93167180c725b80ba63ff85842614f3a3fa763f0be63"} Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.706377 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d is running failed: container process not found" containerID="ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.707886 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d is running failed: container process not found" containerID="ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.739865 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d is running failed: container process not found" containerID="ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.739939 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-7rltw" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.740159 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.749077 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" 
containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.774367 4810 generic.go:334] "Generic (PLEG): container finished" podID="3e68085f-2b80-48bd-a241-70e75780e41e" containerID="db3e2a840110e13b015cb2f3ec8f381cda6865452f991a0abd017a3f0c638437" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.774463 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerDied","Data":"db3e2a840110e13b015cb2f3ec8f381cda6865452f991a0abd017a3f0c638437"} Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.774517 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:04.774549 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.787759 4810 generic.go:334] "Generic (PLEG): container finished" podID="de2b9513-b087-4498-aeef-d912e22091fb" containerID="ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.787819 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw" event={"ID":"de2b9513-b087-4498-aeef-d912e22091fb","Type":"ContainerDied","Data":"ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.789984 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hdwhb_dd1f72b6-eab9-4f83-b5ed-5507c2e79a46/openstack-network-exporter/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.790041 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-hdwhb" event={"ID":"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46","Type":"ContainerDied","Data":"385f4829872c9beb88e76f53ff9a4cc709241bebe66e13d57e9f76e1286944ee"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.790056 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="385f4829872c9beb88e76f53ff9a4cc709241bebe66e13d57e9f76e1286944ee" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.791625 4810 generic.go:334] "Generic (PLEG): container finished" podID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerID="bb77897c0ed3781276a2a824ff0d9bfd38c627747b91d61f75b3403cb0b2103a" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.791655 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerDied","Data":"bb77897c0ed3781276a2a824ff0d9bfd38c627747b91d61f75b3403cb0b2103a"} Oct 03 07:20:05 crc 
kubenswrapper[4810]: I1003 07:20:04.806912 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" containerID="cri-o://444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" gracePeriod=28 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.810219 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2722ac3d-f149-4926-9b5e-cb43d477c15a/ovsdbserver-sb/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.810258 4810 generic.go:334] "Generic (PLEG): container finished" podID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerID="125c3a25a7189d260fedad09c0d614352a064e4e81c52065137ac3f845134c30" exitCode=2 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.810272 4810 generic.go:334] "Generic (PLEG): container finished" podID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerID="00634c712dc0748b4e08ff0a87638caa3ab48eec323f9b5a7efe244d9c29c7f0" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.810292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerDied","Data":"125c3a25a7189d260fedad09c0d614352a064e4e81c52065137ac3f845134c30"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.810318 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerDied","Data":"00634c712dc0748b4e08ff0a87638caa3ab48eec323f9b5a7efe244d9c29c7f0"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.831228 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0121-account-create-5kgnh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.851337 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.851556 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener-log" containerID="cri-o://116ca4730fd85c0b04786638fed840cf4e9ac1f674f214b442febeb58db8324f" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.851941 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener" containerID="cri-o://a7bf0333a5fea00209ffe7878043268796bb6130c3b253a0030e2a551d03432e" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.861616 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.862089 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-75dfb4687f-c6pvn" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker-log" containerID="cri-o://d9d3ff248266b41b739b029c6e20e9253b3a96fceeb907d9681d132dbed6d665" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.862485 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-75dfb4687f-c6pvn" 
podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker" containerID="cri-o://c466dfa6a59d96e0d7faf89c262bdc91ffef8e54dbc15010039c1aa7f1e0a15a" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.877373 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0121-account-create-5kgnh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.886256 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.886522 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-db7b99d8d-9txpw" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api-log" containerID="cri-o://ee5aefdafa078a9fbd3fd95199dc8e09c3b1a213b866d27681b0e68a66781605" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.886791 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-db7b99d8d-9txpw" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api" containerID="cri-o://150fc8415216347ff4c833f2ebf152af21b61c3aaf0e38388292d296069c03af" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.928015 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.928275 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-log" containerID="cri-o://d2effcc3ba0d8b3fd037a8a2e4fb2fa5cfd1c2d632b8be6e535aa804e226e280" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.928758 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-api" containerID="cri-o://04a5b24413e8a224bab7a8f72e4facc8f0a0f5ecaf88b24ac5a41f3d8c47c173" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.936241 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kj8vc"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.947689 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kj8vc"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.959985 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.972149 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.972403 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-584cb866f5-5rvp9" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-api" containerID="cri-o://2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.972522 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-584cb866f5-5rvp9" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-httpd" containerID="cri-o://98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.977354 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-8097-account-create-zgk9n"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:04.991717 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8097-account-create-zgk9n"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.002621 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.031087 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rtx56"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.057912 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rtx56"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.083634 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c289-account-create-pwwl5"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.093164 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c289-account-create-pwwl5"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.105438 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.122889 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-865ck"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.153735 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-865ck"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.163951 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.164331 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-log" containerID="cri-o://09c620dc8fe68cac670d9ae669f099323e4cea23f3ddbfa4a9ad6d121c674e43" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.164374 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-httpd" containerID="cri-o://49fef7a7d668fc557a8e17d62bce3a1fd7aea690b4d9b4a867e5da5faeec8731" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.179293 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6e76-account-create-29sdl"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.189717 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6e76-account-create-29sdl"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.216654 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.216929 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-log" containerID="cri-o://61b1ebe508d4c4ad6781c304ef58a02e24a10dd6ba9c40650db17b0f431011ec" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.218315 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" 
containerName="glance-httpd" containerID="cri-o://ba367086c2d5f7cae40c427ef1243c3b1029c3f76d37b715b51d6a5c5d14c6c7" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.231475 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3325-account-create-b5gmg"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.237086 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" probeResult="failure" output=< Oct 03 07:20:05 crc kubenswrapper[4810]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory Oct 03 07:20:05 crc kubenswrapper[4810]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0 Oct 03 07:20:05 crc kubenswrapper[4810]: > Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.255019 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3325-account-create-b5gmg"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.273696 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.300287 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-t2cwf"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.337222 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10621e58-9aee-4bfc-a0ad-6e4857d48a41" path="/var/lib/kubelet/pods/10621e58-9aee-4bfc-a0ad-6e4857d48a41/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.338214 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f32340d-61ad-431b-912d-60c4087cc192" path="/var/lib/kubelet/pods/1f32340d-61ad-431b-912d-60c4087cc192/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.338876 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fb793a-b8f4-4aa8-8a87-e3f3055200e6" path="/var/lib/kubelet/pods/35fb793a-b8f4-4aa8-8a87-e3f3055200e6/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.339581 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fcb22be-92b4-425f-ba84-5c2fac3cd183" path="/var/lib/kubelet/pods/3fcb22be-92b4-425f-ba84-5c2fac3cd183/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.344169 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4620f70b-3da3-4486-a8d7-1c784246d780" path="/var/lib/kubelet/pods/4620f70b-3da3-4486-a8d7-1c784246d780/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.345262 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a0b6f27-8285-42ab-99ee-35c1b6370602" path="/var/lib/kubelet/pods/5a0b6f27-8285-42ab-99ee-35c1b6370602/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.345890 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d076331-4162-4833-a443-d5060885eaa3" path="/var/lib/kubelet/pods/6d076331-4162-4833-a443-d5060885eaa3/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.346563 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81605a74-99a2-4c21-8bac-8c4b64c2164f" path="/var/lib/kubelet/pods/81605a74-99a2-4c21-8bac-8c4b64c2164f/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.347884 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20a3e4e-60fe-468c-ab50-6a39ad5c27aa" 
path="/var/lib/kubelet/pods/a20a3e4e-60fe-468c-ab50-6a39ad5c27aa/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.348563 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adfe75ef-eb52-4d1d-8164-716ea23ead8d" path="/var/lib/kubelet/pods/adfe75ef-eb52-4d1d-8164-716ea23ead8d/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.349575 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6" path="/var/lib/kubelet/pods/b1e2a4eb-2b12-42dd-9de9-9ef3206b10f6/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.351512 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9581fa8-c37c-4695-8e8f-34e0213752dd" path="/var/lib/kubelet/pods/c9581fa8-c37c-4695-8e8f-34e0213752dd/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.352350 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e40b3794-f1d5-4eab-ae27-c23421053e40" path="/var/lib/kubelet/pods/e40b3794-f1d5-4eab-ae27-c23421053e40/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.354888 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb3dbca9-c4b9-4f3a-b47a-03a50b9af139" path="/var/lib/kubelet/pods/eb3dbca9-c4b9-4f3a-b47a-03a50b9af139/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.355481 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53ec84c-a2d7-4f16-9222-1e3ed0cfe675" path="/var/lib/kubelet/pods/f53ec84c-a2d7-4f16-9222-1e3ed0cfe675/volumes" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.357380 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-t2cwf"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.357491 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9kkxh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.357551 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9kkxh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.364840 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.365124 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6ae3cb61504efe6139088cb930100070bf14667b9b9ded1ec426e0e440d762ab" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.371801 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2b97-account-create-cbnjn"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.376993 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="galera" containerID="cri-o://7b392a0073cffbdd3820c8d247e7f2c85594c4c28bdec1e60745b838785fde2a" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.378922 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2b97-account-create-cbnjn"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.386433 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hg4bz"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.394475 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-2bc5-account-create-zflhm"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.403049 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.409468 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hg4bz"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.418071 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2bc5-account-create-zflhm"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.424801 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.450190 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.462328 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.504392 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2llk2"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.521093 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2llk2"] Oct 03 07:20:05 crc kubenswrapper[4810]: W1003 07:20:05.538303 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20859507_5b44_49b8_96f0_501d89f511c9.slice/crio-57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa WatchSource:0}: Error finding container 57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa: Status 404 returned error can't find the container with id 57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.548495 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.558652 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-hdwhb_dd1f72b6-eab9-4f83-b5ed-5507c2e79a46/openstack-network-exporter/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.558764 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.577199 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-073c-account-create-nkrq2"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.577557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.578355 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.578410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.578480 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87fg\" (UniqueName: \"kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.578557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.578657 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir\") pod \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\" (UID: \"dd1f72b6-eab9-4f83-b5ed-5507c2e79a46\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.579811 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.579858 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.580686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config" (OuterVolumeSpecName: "config") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.584340 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7rltw" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.599065 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2722ac3d-f149-4926-9b5e-cb43d477c15a/ovsdbserver-sb/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.599182 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.614486 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg" (OuterVolumeSpecName: "kube-api-access-l87fg") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "kube-api-access-l87fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.620659 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.632246 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16b4d1d-9bc5-4634-a03e-fd823db95e0d/ovsdbserver-nb/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.632316 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.633000 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-073c-account-create-nkrq2"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.641448 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.665232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.671072 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.671287 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerName="nova-scheduler-scheduler" containerID="cri-o://8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" gracePeriod=30 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.680280 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.680956 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681015 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681060 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681078 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681103 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681149 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681207 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681236 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681406 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681476 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681505 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681550 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681595 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681620 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwp92\" (UniqueName: \"kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681657 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681686 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681726 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681755 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681776 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn\") pod \"de2b9513-b087-4498-aeef-d912e22091fb\" (UID: \"de2b9513-b087-4498-aeef-d912e22091fb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.681869 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682000 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcnzn\" (UniqueName: \"kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682036 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682101 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqgrm\" (UniqueName: \"kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm\") pod \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\" (UID: \"494d64ab-9bc7-4bc6-8cc4-1085679458bb\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682133 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d6g2\" (UniqueName: 
\"kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682165 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts\") pod \"2722ac3d-f149-4926-9b5e-cb43d477c15a\" (UID: \"2722ac3d-f149-4926-9b5e-cb43d477c15a\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.682217 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\" (UID: \"e16b4d1d-9bc5-4634-a03e-fd823db95e0d\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.683544 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.683562 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.683572 4810 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.683582 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l87fg\" (UniqueName: \"kubernetes.io/projected/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-kube-api-access-l87fg\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:05.683710 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: E1003 07:20:05.683775 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data podName:37cd32da-b730-4e57-a2e0-41bf95ff8ca1 nodeName:}" failed. No retries permitted until 2025-10-03 07:20:07.683756442 +0000 UTC m=+1441.111007177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data") pod "rabbitmq-cell1-server-0" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1") : configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.685231 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.686465 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config" (OuterVolumeSpecName: "config") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.687373 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.689107 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts" (OuterVolumeSpecName: "scripts") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.702184 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts" (OuterVolumeSpecName: "scripts") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.707514 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.716877 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run" (OuterVolumeSpecName: "var-run") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.717689 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config" (OuterVolumeSpecName: "config") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.717732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.719165 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts" (OuterVolumeSpecName: "scripts") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.722303 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.740125 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2" (OuterVolumeSpecName: "kube-api-access-2d6g2") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "kube-api-access-2d6g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.770966 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.785968 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret\") pod \"b3a80a81-d436-4fa8-b5b7-560348449df3\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786028 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle\") pod \"b3a80a81-d436-4fa8-b5b7-560348449df3\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786590 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config\") pod \"b3a80a81-d436-4fa8-b5b7-560348449df3\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786619 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gfd7\" (UniqueName: \"kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7\") pod \"b3a80a81-d436-4fa8-b5b7-560348449df3\" (UID: \"b3a80a81-d436-4fa8-b5b7-560348449df3\") " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786938 4810 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786949 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786958 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786967 4810 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786975 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786985 4810 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/de2b9513-b087-4498-aeef-d912e22091fb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.786995 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d6g2\" (UniqueName: \"kubernetes.io/projected/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-kube-api-access-2d6g2\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.787004 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2722ac3d-f149-4926-9b5e-cb43d477c15a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.787013 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de2b9513-b087-4498-aeef-d912e22091fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.787021 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.787029 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.796468 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm" (OuterVolumeSpecName: "kube-api-access-cqgrm") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "kube-api-access-cqgrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.820936 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.832835 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn" (OuterVolumeSpecName: "kube-api-access-mcnzn") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "kube-api-access-mcnzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.837522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.847236 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.847375 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92" (OuterVolumeSpecName: "kube-api-access-qwp92") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "kube-api-access-qwp92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.847418 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.849183 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7" (OuterVolumeSpecName: "kube-api-access-5gfd7") pod "b3a80a81-d436-4fa8-b5b7-560348449df3" (UID: "b3a80a81-d436-4fa8-b5b7-560348449df3"). InnerVolumeSpecName "kube-api-access-5gfd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.852749 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0121-account-delete-lrqrh" event={"ID":"20859507-5b44-49b8-96f0-501d89f511c9","Type":"ContainerStarted","Data":"57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.856344 4810 generic.go:334] "Generic (PLEG): container finished" podID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerID="116ca4730fd85c0b04786638fed840cf4e9ac1f674f214b442febeb58db8324f" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.856384 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerDied","Data":"116ca4730fd85c0b04786638fed840cf4e9ac1f674f214b442febeb58db8324f"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.857951 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerID="98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.857987 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerDied","Data":"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.860142 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e16b4d1d-9bc5-4634-a03e-fd823db95e0d/ovsdbserver-nb/0.log" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.860197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"e16b4d1d-9bc5-4634-a03e-fd823db95e0d","Type":"ContainerDied","Data":"c8d7d5c5deeb27f1069124a7085ee4ba7de677aa6ec0b1d49baaac5a3d796c56"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.860223 4810 scope.go:117] "RemoveContainer" containerID="e15533b44b899ab5c22bbd516df9bac595774dd7440a2dc50fd69ae2fb595cdd" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.860345 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.874604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7rltw" event={"ID":"de2b9513-b087-4498-aeef-d912e22091fb","Type":"ContainerDied","Data":"8ba11fa7b0b6e74c92907035953e3b3b230b40f08f4fbeaa771ee5694cb8423f"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.874706 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7rltw" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.881349 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder8097-account-delete-nn9p8" event={"ID":"28fa9089-99f4-4d50-8fe8-5f4492fe3767","Type":"ContainerStarted","Data":"db6e0d4f23e047950821868b65b55faa7b46962691cf8c1ad44e08922a8fbcf1"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.898966 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.898998 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcnzn\" (UniqueName: \"kubernetes.io/projected/2722ac3d-f149-4926-9b5e-cb43d477c15a-kube-api-access-mcnzn\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.899009 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqgrm\" (UniqueName: \"kubernetes.io/projected/494d64ab-9bc7-4bc6-8cc4-1085679458bb-kube-api-access-cqgrm\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.899023 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.899035 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.899045 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gfd7\" (UniqueName: \"kubernetes.io/projected/b3a80a81-d436-4fa8-b5b7-560348449df3-kube-api-access-5gfd7\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.899053 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwp92\" (UniqueName: \"kubernetes.io/projected/de2b9513-b087-4498-aeef-d912e22091fb-kube-api-access-qwp92\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.940322 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.947698 4810 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.953847 4810 scope.go:117] "RemoveContainer" containerID="6b1b4cc1d1b966fa22564b66bb53d584c8fb69081697fbded46dedc35eaf64df" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.960827 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="rabbitmq" containerID="cri-o://35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f" gracePeriod=604800 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971740 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="889f53c414198ba8568ddac1401c1afddd20f9d138e22cef33957e8a0c27a046" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971775 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="895aa7e546ccbeb235b2c62051d1f3cb4581f5dca2fdbb37fa958319aba74571" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971784 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="888e7fe71210eba8464268fac89d6874d77e318e83fa8a722297b8a5dbe5627c" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971794 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="ac5b608d8a7aa92bd915544af8c042f3b63ff7151a33ff4dbb17a4a861e889b3" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971801 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="2fea3519604050a7d09779c938e0622ac0f41e5a7bf1dd301b0b7a60c049d624" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971810 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="ccf54b0b656baac5b5fa788f8029adbf50760366a62fbd65bb43a3456b3066a8" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971818 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="46fe542f82adb9dd0f09edd022b0f4c932309d5e5f689981a10664ea462e5666" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971827 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="260956333a59a33c2b6828c4e5152fe36a01a3f751272837729f275e4afc7b35" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971834 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="45e3ddc52af834f603045aa999d0eb049c646473758731b30f7e8e2af55782fa" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971843 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="2b0ce3947967fa96d846a63bd3163c855b442489cfeea6683ecb041c0f9cf8f1" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971850 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="9974961e6abf7febdcf8e34a0c97b87742b29d09441f5766a4da92668b493130" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971857 4810 
generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="0f9c3077ad8cecdd534fca11fbffb1bf485c180ae8d7129b25caffd483249975" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971863 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="a6d2ec4008f34f574e9c7b7b8a4964291b10701cd744120f6b95298162e1741c" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971870 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="bccbd5c0aebf7a8d7a2c209dbc4d28096af4feec63ee3acb6fb99edbed1f7128" exitCode=0 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"889f53c414198ba8568ddac1401c1afddd20f9d138e22cef33957e8a0c27a046"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971957 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"895aa7e546ccbeb235b2c62051d1f3cb4581f5dca2fdbb37fa958319aba74571"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971969 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"888e7fe71210eba8464268fac89d6874d77e318e83fa8a722297b8a5dbe5627c"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971980 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"ac5b608d8a7aa92bd915544af8c042f3b63ff7151a33ff4dbb17a4a861e889b3"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.971990 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"2fea3519604050a7d09779c938e0622ac0f41e5a7bf1dd301b0b7a60c049d624"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972000 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"ccf54b0b656baac5b5fa788f8029adbf50760366a62fbd65bb43a3456b3066a8"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"46fe542f82adb9dd0f09edd022b0f4c932309d5e5f689981a10664ea462e5666"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972017 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"260956333a59a33c2b6828c4e5152fe36a01a3f751272837729f275e4afc7b35"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972025 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"45e3ddc52af834f603045aa999d0eb049c646473758731b30f7e8e2af55782fa"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972033 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"2b0ce3947967fa96d846a63bd3163c855b442489cfeea6683ecb041c0f9cf8f1"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972041 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"9974961e6abf7febdcf8e34a0c97b87742b29d09441f5766a4da92668b493130"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"0f9c3077ad8cecdd534fca11fbffb1bf485c180ae8d7129b25caffd483249975"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972057 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"a6d2ec4008f34f574e9c7b7b8a4964291b10701cd744120f6b95298162e1741c"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.972065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"bccbd5c0aebf7a8d7a2c209dbc4d28096af4feec63ee3acb6fb99edbed1f7128"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.978819 4810 generic.go:334] "Generic (PLEG): container finished" podID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerID="61b1ebe508d4c4ad6781c304ef58a02e24a10dd6ba9c40650db17b0f431011ec" exitCode=143 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.978865 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerDied","Data":"61b1ebe508d4c4ad6781c304ef58a02e24a10dd6ba9c40650db17b0f431011ec"} Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.984372 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="rabbitmq" containerID="cri-o://7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec" gracePeriod=604800 Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.986672 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" (UID: "dd1f72b6-eab9-4f83-b5ed-5507c2e79a46"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:05 crc kubenswrapper[4810]: I1003 07:20:05.992452 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t8rjv"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:05.996349 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:05.996804 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fc6f8ff7-45579" event={"ID":"494d64ab-9bc7-4bc6-8cc4-1085679458bb","Type":"ContainerDied","Data":"5798f8de169134cd8932aa65143af585f5fc92ee28445f5e4fa154e563d3f7e9"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.013807 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.013839 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.013850 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.041586 4810 generic.go:334] "Generic (PLEG): container finished" podID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerID="09c620dc8fe68cac670d9ae669f099323e4cea23f3ddbfa4a9ad6d121c674e43" exitCode=143 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.041676 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerDied","Data":"09c620dc8fe68cac670d9ae669f099323e4cea23f3ddbfa4a9ad6d121c674e43"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.045954 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.046190 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" containerName="nova-cell0-conductor-conductor" containerID="cri-o://58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" gracePeriod=30 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.063053 4810 scope.go:117] "RemoveContainer" containerID="ca9a05b79537a38f25a00f593e55190a4e5e7341d7b2b443d1e9c90a380cd91d" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.097317 4810 generic.go:334] "Generic (PLEG): container finished" podID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerID="3f1514c03f3de86269cf910c6642b971ca929d24ed8843f73f5cd456ca222515" exitCode=143 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.097423 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerDied","Data":"3f1514c03f3de86269cf910c6642b971ca929d24ed8843f73f5cd456ca222515"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.120944 4810 generic.go:334] "Generic (PLEG): container finished" podID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerID="d2effcc3ba0d8b3fd037a8a2e4fb2fa5cfd1c2d632b8be6e535aa804e226e280" exitCode=143 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.121060 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerDied","Data":"d2effcc3ba0d8b3fd037a8a2e4fb2fa5cfd1c2d632b8be6e535aa804e226e280"} Oct 03 
07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.147191 4810 generic.go:334] "Generic (PLEG): container finished" podID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" exitCode=0 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.147263 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerDied","Data":"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.150553 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t8rjv"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.156110 4810 generic.go:334] "Generic (PLEG): container finished" podID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerID="ee5aefdafa078a9fbd3fd95199dc8e09c3b1a213b866d27681b0e68a66781605" exitCode=143 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.156178 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerDied","Data":"ee5aefdafa078a9fbd3fd95199dc8e09c3b1a213b866d27681b0e68a66781605"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.158868 4810 generic.go:334] "Generic (PLEG): container finished" podID="b3a80a81-d436-4fa8-b5b7-560348449df3" containerID="b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf" exitCode=137 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.159020 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.186079 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.186284 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerName="nova-cell1-conductor-conductor" containerID="cri-o://83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" gracePeriod=30 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.206603 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerID="3216b2c3caf454d8a6c7df9a162014f7925ab9d4e0f9fb8999f1009a4df3da6f" exitCode=0 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.206687 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerDied","Data":"3216b2c3caf454d8a6c7df9a162014f7925ab9d4e0f9fb8999f1009a4df3da6f"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.278007 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dcgw8"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.288614 4810 scope.go:117] "RemoveContainer" containerID="6b16ee8566994bdd9ecd93167180c725b80ba63ff85842614f3a3fa763f0be63" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.325398 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerID="d9d3ff248266b41b739b029c6e20e9253b3a96fceeb907d9681d132dbed6d665" exitCode=143 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.326226 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-dcgw8"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.326333 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerDied","Data":"d9d3ff248266b41b739b029c6e20e9253b3a96fceeb907d9681d132dbed6d665"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.343545 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.343785 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-766c6867bf-vjpbj" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-httpd" containerID="cri-o://d1f756ac00c5c1c73d319dfc9ee7cf96f32a3e5421d41971071b05ee6e0f4c9e" gracePeriod=30 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.344386 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-766c6867bf-vjpbj" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-server" containerID="cri-o://3f01201fc73b718d0aea9571880ce1f2a30e6b5a5955dcbaa7b43e5aedc4fe65" gracePeriod=30 Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.356754 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.374734 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.382626 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2722ac3d-f149-4926-9b5e-cb43d477c15a/ovsdbserver-sb/0.log" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.383026 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-hdwhb" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.402457 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2722ac3d-f149-4926-9b5e-cb43d477c15a","Type":"ContainerDied","Data":"995fd8cea6e7fd1d4a637b0f857271a5cdacc40d36a7b64e6d825b969aa66fd8"} Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.402604 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.451616 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.507247 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.554341 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.603739 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.658963 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:06 crc kubenswrapper[4810]: E1003 07:20:06.663868 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 07:20:06 crc kubenswrapper[4810]: E1003 07:20:06.664026 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data podName:94a52a94-56b0-4dc9-9804-020a890b1fff nodeName:}" failed. No retries permitted until 2025-10-03 07:20:10.663993301 +0000 UTC m=+1444.091244216 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data") pod "rabbitmq-server-0" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff") : configmap "rabbitmq-config-data" not found Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.695199 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.725294 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-766c6867bf-vjpbj" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.161:8080/healthcheck\": read tcp 10.217.0.2:51572->10.217.0.161:8080: read: connection reset by peer" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.725331 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-766c6867bf-vjpbj" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.161:8080/healthcheck\": read tcp 10.217.0.2:51570->10.217.0.161:8080: read: connection reset by peer" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.765411 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.845128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3a80a81-d436-4fa8-b5b7-560348449df3" (UID: "b3a80a81-d436-4fa8-b5b7-560348449df3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.870502 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.883552 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b3a80a81-d436-4fa8-b5b7-560348449df3" (UID: "b3a80a81-d436-4fa8-b5b7-560348449df3"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.883675 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.888646 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.931255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.932748 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.970318 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b3a80a81-d436-4fa8-b5b7-560348449df3" (UID: "b3a80a81-d436-4fa8-b5b7-560348449df3"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.971003 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "de2b9513-b087-4498-aeef-d912e22091fb" (UID: "de2b9513-b087-4498-aeef-d912e22091fb"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972523 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972541 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b3a80a81-d436-4fa8-b5b7-560348449df3-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972550 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972559 4810 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972568 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972577 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/de2b9513-b087-4498-aeef-d912e22091fb-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.972586 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:06 crc kubenswrapper[4810]: I1003 07:20:06.993730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config" (OuterVolumeSpecName: "config") pod "494d64ab-9bc7-4bc6-8cc4-1085679458bb" (UID: "494d64ab-9bc7-4bc6-8cc4-1085679458bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.035372 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.077623 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/494d64ab-9bc7-4bc6-8cc4-1085679458bb-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.077657 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.085034 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.091142 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2722ac3d-f149-4926-9b5e-cb43d477c15a" (UID: "2722ac3d-f149-4926-9b5e-cb43d477c15a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.109491 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e16b4d1d-9bc5-4634-a03e-fd823db95e0d" (UID: "e16b4d1d-9bc5-4634-a03e-fd823db95e0d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.179663 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.179694 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e16b4d1d-9bc5-4634-a03e-fd823db95e0d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.179704 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2722ac3d-f149-4926-9b5e-cb43d477c15a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.421929 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbb50cd-f5bf-48b4-95a9-0f0f42725c27" path="/var/lib/kubelet/pods/0dbb50cd-f5bf-48b4-95a9-0f0f42725c27/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.422725 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edd30f8-0847-4961-8f40-399967851ebe" path="/var/lib/kubelet/pods/0edd30f8-0847-4961-8f40-399967851ebe/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.423235 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34781cb2-747c-4d91-b346-725bb9b63394" path="/var/lib/kubelet/pods/34781cb2-747c-4d91-b346-725bb9b63394/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.424226 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3605c50a-f081-4145-a88c-4c83b7541db1" path="/var/lib/kubelet/pods/3605c50a-f081-4145-a88c-4c83b7541db1/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.425209 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384a7a09-09bc-4322-b41c-073344b351ef" path="/var/lib/kubelet/pods/384a7a09-09bc-4322-b41c-073344b351ef/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.425658 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7e8bca-49ef-4f7b-b962-a215f9a42605" path="/var/lib/kubelet/pods/7d7e8bca-49ef-4f7b-b962-a215f9a42605/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.426182 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a80a81-d436-4fa8-b5b7-560348449df3" path="/var/lib/kubelet/pods/b3a80a81-d436-4fa8-b5b7-560348449df3/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.427578 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c254e046-e923-4f68-8d96-e098b4ba354e" path="/var/lib/kubelet/pods/c254e046-e923-4f68-8d96-e098b4ba354e/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.428051 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd0e820-b591-4d7d-94d4-ccf6ae82ddab" path="/var/lib/kubelet/pods/dcd0e820-b591-4d7d-94d4-ccf6ae82ddab/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.428486 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f135c1f3-8f57-4fba-b0a3-b83ef524acfd" path="/var/lib/kubelet/pods/f135c1f3-8f57-4fba-b0a3-b83ef524acfd/volumes" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.443417 4810 generic.go:334] "Generic (PLEG): container finished" 
podID="e0b20ced-eb2f-41a3-8988-611615af5759" containerID="3f01201fc73b718d0aea9571880ce1f2a30e6b5a5955dcbaa7b43e5aedc4fe65" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.443450 4810 generic.go:334] "Generic (PLEG): container finished" podID="e0b20ced-eb2f-41a3-8988-611615af5759" containerID="d1f756ac00c5c1c73d319dfc9ee7cf96f32a3e5421d41971071b05ee6e0f4c9e" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.444566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerDied","Data":"3f01201fc73b718d0aea9571880ce1f2a30e6b5a5955dcbaa7b43e5aedc4fe65"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.444627 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerDied","Data":"d1f756ac00c5c1c73d319dfc9ee7cf96f32a3e5421d41971071b05ee6e0f4c9e"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.459126 4810 generic.go:334] "Generic (PLEG): container finished" podID="20859507-5b44-49b8-96f0-501d89f511c9" containerID="a08158729ce0fad165b64b76d5d8b70aee5356d571f7f8136158efbbc91849ec" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.459225 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0121-account-delete-lrqrh" event={"ID":"20859507-5b44-49b8-96f0-501d89f511c9","Type":"ContainerDied","Data":"a08158729ce0fad165b64b76d5d8b70aee5356d571f7f8136158efbbc91849ec"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.465452 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell03325-account-delete-sqg6n" event={"ID":"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa","Type":"ContainerStarted","Data":"d9acb09122787a0f17af4564d93b3d0b7df17919e76fea1f1bc9b0f1e57ba686"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.467122 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" containerID="6ae3cb61504efe6139088cb930100070bf14667b9b9ded1ec426e0e440d762ab" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.467238 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8","Type":"ContainerDied","Data":"6ae3cb61504efe6139088cb930100070bf14667b9b9ded1ec426e0e440d762ab"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.467311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8","Type":"ContainerDied","Data":"4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.467366 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dbe94348df55c02ba2f74f71e17f551f089595d41e75054e06d83700d385785" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.531578 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerID="0247a4056b29a103d07032ed7a3c453d738bdaa7868ee5953410287fdcac3220" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.531681 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerDied","Data":"0247a4056b29a103d07032ed7a3c453d738bdaa7868ee5953410287fdcac3220"} 
Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.538369 4810 generic.go:334] "Generic (PLEG): container finished" podID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerID="c466dfa6a59d96e0d7faf89c262bdc91ffef8e54dbc15010039c1aa7f1e0a15a" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.538427 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerDied","Data":"c466dfa6a59d96e0d7faf89c262bdc91ffef8e54dbc15010039c1aa7f1e0a15a"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.538453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-75dfb4687f-c6pvn" event={"ID":"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef","Type":"ContainerDied","Data":"024c48216f5fe8ff3dc58959f6971cecd1b851179685ac4bdb0d43c86358ed68"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.538465 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="024c48216f5fe8ff3dc58959f6971cecd1b851179685ac4bdb0d43c86358ed68" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.544731 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi2bc5-account-delete-8p8mv" event={"ID":"5c723fc6-aae6-4032-afba-622dd5ea4dbc","Type":"ContainerStarted","Data":"b8fd1274c7bb002b7eed16cc22bdd6be3e2781e36e837889d4aa6576954dd4ab"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.554715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron073c-account-delete-9hj8g" event={"ID":"b493c984-b8a7-4a78-9210-fc44ecf70eec","Type":"ContainerStarted","Data":"2d9b2d7c30b0760c100de9cf8f24641b9104eec90f95585773f024e21b1479e7"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.557422 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementc289-account-delete-v9bc4" event={"ID":"ebbf1a25-79f6-42f3-af72-ff0e33094ecd","Type":"ContainerStarted","Data":"bc190f56838f6c3c55e6bb6b56769d1b19120c7d476eb7b9d11ea647daed1e3f"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.560921 4810 generic.go:334] "Generic (PLEG): container finished" podID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerID="a7bf0333a5fea00209ffe7878043268796bb6130c3b253a0030e2a551d03432e" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.561068 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerDied","Data":"a7bf0333a5fea00209ffe7878043268796bb6130c3b253a0030e2a551d03432e"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.562748 4810 generic.go:334] "Generic (PLEG): container finished" podID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerID="7b392a0073cffbdd3820c8d247e7f2c85594c4c28bdec1e60745b838785fde2a" exitCode=0 Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.562828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerDied","Data":"7b392a0073cffbdd3820c8d247e7f2c85594c4c28bdec1e60745b838785fde2a"} Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.562916 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5d666cf5-d522-45c6-8fbf-13da1289394a","Type":"ContainerDied","Data":"f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1"} Oct 03 07:20:07 crc 
kubenswrapper[4810]: I1003 07:20:07.563009 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f42b103f67de3b7d6b1c28516a60543e4ee28eb8a9264e817e659eb4e27db3d1" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.657399 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": read tcp 10.217.0.2:45380->10.217.0.173:8776: read: connection reset by peer" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.681071 4810 scope.go:117] "RemoveContainer" containerID="d78be4071ad7d91cc40af56bcbd986c937fd0441f9a42d2d109631181b9647a8" Oct 03 07:20:07 crc kubenswrapper[4810]: E1003 07:20:07.699072 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:07 crc kubenswrapper[4810]: E1003 07:20:07.699144 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data podName:37cd32da-b730-4e57-a2e0-41bf95ff8ca1 nodeName:}" failed. No retries permitted until 2025-10-03 07:20:11.699128545 +0000 UTC m=+1445.126379280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data") pod "rabbitmq-cell1-server-0" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1") : configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.912310 4810 scope.go:117] "RemoveContainer" containerID="b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.934246 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:20:07 crc kubenswrapper[4810]: I1003 07:20:07.998376 4810 scope.go:117] "RemoveContainer" containerID="b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf" Oct 03 07:20:08 crc kubenswrapper[4810]: E1003 07:20:07.999249 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf\": container with ID starting with b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf not found: ID does not exist" containerID="b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:07.999308 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf"} err="failed to get container status \"b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf\": rpc error: code = NotFound desc = could not find container \"b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf\": container with ID starting with b4e5fd8689e827d6b22fecf6e3b5ddeb0a0692304491af41779c7249674a7faf not found: ID does not exist" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:07.999328 4810 scope.go:117] "RemoveContainer" containerID="125c3a25a7189d260fedad09c0d614352a064e4e81c52065137ac3f845134c30" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.003652 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data\") pod \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.003879 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs\") pod \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.003979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtg8l\" (UniqueName: \"kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l\") pod \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.004039 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs\") pod \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.004080 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle\") pod \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\" (UID: \"1b8d0a7a-6384-400f-91b8-3bf1cfb785a8\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.034213 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l" (OuterVolumeSpecName: 
"kube-api-access-xtg8l") pod "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" (UID: "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8"). InnerVolumeSpecName "kube-api-access-xtg8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.078014 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.079048 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data" (OuterVolumeSpecName: "config-data") pod "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" (UID: "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.096371 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": dial tcp 10.217.0.206:8775: connect: connection refused" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.097001 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": dial tcp 10.217.0.206:8775: connect: connection refused" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.106498 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtg8l\" (UniqueName: \"kubernetes.io/projected/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-kube-api-access-xtg8l\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.106533 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.110088 4810 scope.go:117] "RemoveContainer" containerID="00634c712dc0748b4e08ff0a87638caa3ab48eec323f9b5a7efe244d9c29c7f0" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.122222 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.175956 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" (UID: "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209025 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209544 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghzgt\" (UniqueName: \"kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt\") pod \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209575 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209724 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209784 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle\") pod \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.209865 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs\") pod \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data\") pod \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q7lr\" (UniqueName: \"kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210295 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: 
\"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210332 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210379 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom\") pod \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\" (UID: \"7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts\") pod \"5d666cf5-d522-45c6-8fbf-13da1289394a\" (UID: \"5d666cf5-d522-45c6-8fbf-13da1289394a\") " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.210270 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.215063 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.215152 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.216812 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.216837 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.216847 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5d666cf5-d522-45c6-8fbf-13da1289394a-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.216859 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.222181 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs" (OuterVolumeSpecName: "logs") pod "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" (UID: "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.222430 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.222781 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt" (OuterVolumeSpecName: "kube-api-access-ghzgt") pod "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" (UID: "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef"). InnerVolumeSpecName "kube-api-access-ghzgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.234448 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr" (OuterVolumeSpecName: "kube-api-access-8q7lr") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "kube-api-access-8q7lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.235658 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets" (OuterVolumeSpecName: "secrets") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.238402 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" (UID: "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.275250 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318638 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318681 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318694 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghzgt\" (UniqueName: \"kubernetes.io/projected/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-kube-api-access-ghzgt\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318707 4810 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318717 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318727 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q7lr\" (UniqueName: \"kubernetes.io/projected/5d666cf5-d522-45c6-8fbf-13da1289394a-kube-api-access-8q7lr\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.318738 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5d666cf5-d522-45c6-8fbf-13da1289394a-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.397071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.400463 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" (UID: "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.410837 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.426399 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.426441 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.426455 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.438880 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" (UID: "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.467614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data" (OuterVolumeSpecName: "config-data") pod "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" (UID: "7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.525028 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" (UID: "1b8d0a7a-6384-400f-91b8-3bf1cfb785a8"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.528717 4810 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.528746 4810 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.528755 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.534795 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "5d666cf5-d522-45c6-8fbf-13da1289394a" (UID: "5d666cf5-d522-45c6-8fbf-13da1289394a"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.601117 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi2bc5-account-delete-8p8mv" event={"ID":"5c723fc6-aae6-4032-afba-622dd5ea4dbc","Type":"ContainerStarted","Data":"9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.601294 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi2bc5-account-delete-8p8mv" podUID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" containerName="mariadb-account-delete" containerID="cri-o://9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9" gracePeriod=30 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.621017 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi2bc5-account-delete-8p8mv" podStartSLOduration=5.620997035 podStartE2EDuration="5.620997035s" podCreationTimestamp="2025-10-03 07:20:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 07:20:08.620569283 +0000 UTC m=+1442.047820028" watchObservedRunningTime="2025-10-03 07:20:08.620997035 +0000 UTC m=+1442.048247770" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.624652 4810 generic.go:334] "Generic (PLEG): container finished" podID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerID="61317c73aecc47dae6a6843a4447305cfab5916eb86c694594f67fbaea6a141f" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.624756 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerDied","Data":"61317c73aecc47dae6a6843a4447305cfab5916eb86c694594f67fbaea6a141f"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.624792 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68aa80a4-2dc5-4bf9-a950-6a612c2e182b","Type":"ContainerDied","Data":"06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.624807 4810 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="06da84353a0a9d4625f55ee54d4e3d49da7f0d113b3eee1685292aec2a38361c" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.632479 4810 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d666cf5-d522-45c6-8fbf-13da1289394a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.635816 4810 generic.go:334] "Generic (PLEG): container finished" podID="3e68085f-2b80-48bd-a241-70e75780e41e" containerID="1f17eff80bda95e4a3a828da788b685a51858f514dff3cf24b3bd09f553d4640" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.635888 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerDied","Data":"1f17eff80bda95e4a3a828da788b685a51858f514dff3cf24b3bd09f553d4640"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.635934 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75d48cc7d6-2knhz" event={"ID":"3e68085f-2b80-48bd-a241-70e75780e41e","Type":"ContainerDied","Data":"3450f95d00e832442fafa3eaabe56f1ca4f5b15494c63b505249f8f1765d1adc"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.635947 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3450f95d00e832442fafa3eaabe56f1ca4f5b15494c63b505249f8f1765d1adc" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.645407 4810 generic.go:334] "Generic (PLEG): container finished" podID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerID="04a5b24413e8a224bab7a8f72e4facc8f0a0f5ecaf88b24ac5a41f3d8c47c173" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.645491 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerDied","Data":"04a5b24413e8a224bab7a8f72e4facc8f0a0f5ecaf88b24ac5a41f3d8c47c173"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.648088 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" event={"ID":"1d279481-fd9c-4c0f-b7d6-ce457c1a697e","Type":"ContainerDied","Data":"4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.648130 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f178fcc69558ca14c3604fd92a4ade4c5328a396508fe717c6717759c6674b9" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.654122 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0121-account-delete-lrqrh" event={"ID":"20859507-5b44-49b8-96f0-501d89f511c9","Type":"ContainerDied","Data":"57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.654163 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b4053a1b545874e8aadefe3480d893b37e07f061018aed1e1f48ce956d02aa" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.658613 4810 generic.go:334] "Generic (PLEG): container finished" podID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerID="150fc8415216347ff4c833f2ebf152af21b61c3aaf0e38388292d296069c03af" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.658682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" 
event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerDied","Data":"150fc8415216347ff4c833f2ebf152af21b61c3aaf0e38388292d296069c03af"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.665340 4810 generic.go:334] "Generic (PLEG): container finished" podID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerID="95b505fefa5eb2055f79509fe6a51262d5e6c2ddec8bdf8c70e3b8cacb9557c6" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.665398 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerDied","Data":"95b505fefa5eb2055f79509fe6a51262d5e6c2ddec8bdf8c70e3b8cacb9557c6"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.665473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7","Type":"ContainerDied","Data":"1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.665492 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1236917410ba7b3dde13f53bbae4ef8e234e6fd75465c089c4ae0ab2ed14e082" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.675120 4810 generic.go:334] "Generic (PLEG): container finished" podID="b493c984-b8a7-4a78-9210-fc44ecf70eec" containerID="c2ce06d888aab36e7467034584a2c26ca4a4529f3aa58301133aa498bf2388a0" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.675183 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron073c-account-delete-9hj8g" event={"ID":"b493c984-b8a7-4a78-9210-fc44ecf70eec","Type":"ContainerDied","Data":"c2ce06d888aab36e7467034584a2c26ca4a4529f3aa58301133aa498bf2388a0"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.689741 4810 generic.go:334] "Generic (PLEG): container finished" podID="28fa9089-99f4-4d50-8fe8-5f4492fe3767" containerID="406b0014a3e09b53049dab7e73d10bcd487a07a0390eb2d728848d7df1f67cc2" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.689820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder8097-account-delete-nn9p8" event={"ID":"28fa9089-99f4-4d50-8fe8-5f4492fe3767","Type":"ContainerDied","Data":"406b0014a3e09b53049dab7e73d10bcd487a07a0390eb2d728848d7df1f67cc2"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.704691 4810 generic.go:334] "Generic (PLEG): container finished" podID="ebbf1a25-79f6-42f3-af72-ff0e33094ecd" containerID="afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.704776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementc289-account-delete-v9bc4" event={"ID":"ebbf1a25-79f6-42f3-af72-ff0e33094ecd","Type":"ContainerDied","Data":"afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.752905 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-766c6867bf-vjpbj" event={"ID":"e0b20ced-eb2f-41a3-8988-611615af5759","Type":"ContainerDied","Data":"0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.752947 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0161aedafc1a078c4215b119c3a46dabbb874efbce8b537649b6c8264672b973" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.768350 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerID="ba367086c2d5f7cae40c427ef1243c3b1029c3f76d37b715b51d6a5c5d14c6c7" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.768441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerDied","Data":"ba367086c2d5f7cae40c427ef1243c3b1029c3f76d37b715b51d6a5c5d14c6c7"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.788318 4810 generic.go:334] "Generic (PLEG): container finished" podID="1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" containerID="b1217ef0c832fd42381708528dc2544dfbb04164aeff3f282d577c64071f2c38" exitCode=0 Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.788454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell03325-account-delete-sqg6n" event={"ID":"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa","Type":"ContainerDied","Data":"b1217ef0c832fd42381708528dc2544dfbb04164aeff3f282d577c64071f2c38"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.830703 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.835391 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7a72f53c-a634-4457-9cab-3a4cbea90eae","Type":"ContainerDied","Data":"946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b"} Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.835445 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="946d365e201d8880adee1dfd20e3ea149150a028a9e698e0f06efcf3b4698c2b" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.835530 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-75dfb4687f-c6pvn" Oct 03 07:20:08 crc kubenswrapper[4810]: I1003 07:20:08.842562 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.044257 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.044774 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="sg-core" containerID="cri-o://da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.044982 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="proxy-httpd" containerID="cri-o://2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.045038 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-notification-agent" containerID="cri-o://6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.045533 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-central-agent" containerID="cri-o://23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.095297 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.095927 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="eb6eb4ae-ef62-480d-80a9-411be811155b" containerName="kube-state-metrics" containerID="cri-o://2453516ab32d8b1790716264b607d141d5a90ef1a6cefa4be573a9c5999f1aee" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.149244 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.201127 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.255692 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom\") pod \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.255774 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data\") pod \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.255830 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf6zs\" (UniqueName: \"kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs\") pod \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.255862 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle\") pod \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.255977 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs\") pod \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\" (UID: \"1d279481-fd9c-4c0f-b7d6-ce457c1a697e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.259604 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs" (OuterVolumeSpecName: "logs") pod "1d279481-fd9c-4c0f-b7d6-ce457c1a697e" (UID: "1d279481-fd9c-4c0f-b7d6-ce457c1a697e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.260972 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59fc6f8ff7-45579"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.287371 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs" (OuterVolumeSpecName: "kube-api-access-vf6zs") pod "1d279481-fd9c-4c0f-b7d6-ce457c1a697e" (UID: "1d279481-fd9c-4c0f-b7d6-ce457c1a697e"). InnerVolumeSpecName "kube-api-access-vf6zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.347312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d279481-fd9c-4c0f-b7d6-ce457c1a697e" (UID: "1d279481-fd9c-4c0f-b7d6-ce457c1a697e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.361706 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.362014 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.362074 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf6zs\" (UniqueName: \"kubernetes.io/projected/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-kube-api-access-vf6zs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.362086 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.362143 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.362260 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.379204 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.381347 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.404766 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.404849 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerName="nova-cell1-conductor-conductor" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.424053 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.424106 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.456028 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" path="/var/lib/kubelet/pods/494d64ab-9bc7-4bc6-8cc4-1085679458bb/volumes" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.457581 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d279481-fd9c-4c0f-b7d6-ce457c1a697e" (UID: "1d279481-fd9c-4c0f-b7d6-ce457c1a697e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.463036 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465268 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465493 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465782 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87dgt\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465906 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.465979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.466091 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs\") pod \"e0b20ced-eb2f-41a3-8988-611615af5759\" (UID: \"e0b20ced-eb2f-41a3-8988-611615af5759\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.473606 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.475067 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.481400 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.489171 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt" (OuterVolumeSpecName: "kube-api-access-87dgt") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "kube-api-access-87dgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.501117 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.507668 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.507713 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.507731 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.507749 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.507760 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7rltw"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.508064 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" containerName="memcached" containerID="cri-o://68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.516224 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fhzl5"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.521637 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.529956 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.552123 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.563270 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576619 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576694 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576722 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wg8s\" (UniqueName: \"kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s\") pod \"20859507-5b44-49b8-96f0-501d89f511c9\" (UID: \"20859507-5b44-49b8-96f0-501d89f511c9\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576865 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxbv7\" (UniqueName: \"kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.576921 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle\") pod \"7a72f53c-a634-4457-9cab-3a4cbea90eae\" (UID: \"7a72f53c-a634-4457-9cab-3a4cbea90eae\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.577283 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.577301 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87dgt\" (UniqueName: \"kubernetes.io/projected/e0b20ced-eb2f-41a3-8988-611615af5759-kube-api-access-87dgt\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.577311 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.577320 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e0b20ced-eb2f-41a3-8988-611615af5759-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.591004 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.601259 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-59s9b"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.614589 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.615543 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.615738 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-59s9b"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.623706 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fhzl5"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.654310 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystonef3ef-account-delete-ppgzd"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.654920 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20859507-5b44-49b8-96f0-501d89f511c9" containerName="mariadb-account-delete" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656216 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="20859507-5b44-49b8-96f0-501d89f511c9" containerName="mariadb-account-delete" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656238 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="ovsdbserver-sb" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656244 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="ovsdbserver-sb" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656257 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656263 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker-log" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656276 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656282 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656294 4810 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656303 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api-log" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656311 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656317 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656325 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="ovsdbserver-nb" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656332 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="ovsdbserver-nb" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656350 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="mysql-bootstrap" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656355 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="mysql-bootstrap" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656363 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-api" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656369 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-api" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656382 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="cinder-scheduler" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656387 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="cinder-scheduler" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656393 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="init" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656399 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="init" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656411 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656417 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656423 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656429 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656438 4810 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-httpd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656443 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-httpd" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656452 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656458 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656464 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="probe" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656470 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="probe" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656482 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656488 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656498 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656504 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656514 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656519 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656530 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-server" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656536 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-server" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656547 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656552 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-log" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656565 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656571 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener-log" Oct 03 07:20:09 crc 
kubenswrapper[4810]: E1003 07:20:09.656583 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="galera" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656588 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="galera" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656597 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656602 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656616 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="dnsmasq-dns" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656622 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="dnsmasq-dns" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.656636 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.656642 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658180 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="20859507-5b44-49b8-96f0-501d89f511c9" containerName="mariadb-account-delete" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658201 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658209 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658218 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2b9513-b087-4498-aeef-d912e22091fb" containerName="ovn-controller" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658230 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-api" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658241 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="494d64ab-9bc7-4bc6-8cc4-1085679458bb" containerName="dnsmasq-dns" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658248 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658256 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658264 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" containerName="placement-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658273 4810 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658284 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-server" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658297 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" containerName="galera" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658304 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" containerName="barbican-keystone-listener" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658311 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" containerName="cinder-api" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658321 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="cinder-scheduler" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658329 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="openstack-network-exporter" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658339 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" containerName="ovsdbserver-nb" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658347 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" containerName="ovsdbserver-sb" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658354 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658364 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" containerName="nova-metadata-metadata" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658371 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" containerName="probe" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658381 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" containerName="proxy-httpd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658389 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" containerName="barbican-worker-log" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658395 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658821 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.658922 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.667683 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.673679 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.674067 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5b9f9f6fd8-z8ls6" podUID="a856067b-dba0-4860-93c3-ff4650760a4b" containerName="keystone-api" containerID="cri-o://accc7eedd538c9a716310f485205e12a3004a700295626a4fb4494a1c3cf497b" gracePeriod=30 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.676299 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-hdwhb"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680540 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680616 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680654 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680684 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680748 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs\") pod \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680786 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680832 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680864 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data\") pod \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\" (UID: 
\"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680956 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.680992 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6pr\" (UniqueName: \"kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681023 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681052 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681093 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681145 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681186 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs\") pod \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffvrh\" (UniqueName: \"kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh\") pod \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681306 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d656\" (UniqueName: \"kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: 
\"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681334 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle\") pod \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\" (UID: \"3c746de5-0009-4522-8e8b-f7d7e0ca5fe7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681359 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs\") pod \"3e68085f-2b80-48bd-a241-70e75780e41e\" (UID: \"3e68085f-2b80-48bd-a241-70e75780e41e\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681394 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs\") pod \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\" (UID: \"68aa80a4-2dc5-4bf9-a950-6a612c2e182b\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.681634 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7" (OuterVolumeSpecName: "kube-api-access-dxbv7") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "kube-api-access-dxbv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.682053 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.682069 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7a72f53c-a634-4457-9cab-3a4cbea90eae-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.682078 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxbv7\" (UniqueName: \"kubernetes.io/projected/7a72f53c-a634-4457-9cab-3a4cbea90eae-kube-api-access-dxbv7\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.682719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs" (OuterVolumeSpecName: "logs") pod "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" (UID: "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.682735 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs" (OuterVolumeSpecName: "logs") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.697445 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s" (OuterVolumeSpecName: "kube-api-access-8wg8s") pod "20859507-5b44-49b8-96f0-501d89f511c9" (UID: "20859507-5b44-49b8-96f0-501d89f511c9"). InnerVolumeSpecName "kube-api-access-8wg8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.698490 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.701002 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.710911 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts" (OuterVolumeSpecName: "scripts") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.712625 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs" (OuterVolumeSpecName: "logs") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.714232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.716116 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr" (OuterVolumeSpecName: "kube-api-access-2k6pr") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "kube-api-access-2k6pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.716564 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts" (OuterVolumeSpecName: "scripts") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.718150 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.718385 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.718497 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.721335 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.721404 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.722827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh" (OuterVolumeSpecName: "kube-api-access-ffvrh") pod "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" (UID: "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7"). InnerVolumeSpecName "kube-api-access-ffvrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.723798 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.727768 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.727765 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts" (OuterVolumeSpecName: "scripts") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.727837 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.731045 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656" (OuterVolumeSpecName: "kube-api-access-7d656") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). 
InnerVolumeSpecName "kube-api-access-7d656". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.733190 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.733241 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.733607 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.733628 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.743972 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.789824 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.789879 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790051 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790116 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmc8\" (UniqueName: \"kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790175 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790203 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlpv6\" (UniqueName: \"kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790680 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790758 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790814 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.790939 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.793042 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.807045 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data\") pod \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\" (UID: \"2cde0e0d-aa4d-45e1-bac8-52225a6c997c\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.807254 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs\") pod \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\" (UID: \"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.807364 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgs2q\" (UniqueName: \"kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q\") pod \"28fa9089-99f4-4d50-8fe8-5f4492fe3767\" (UID: \"28fa9089-99f4-4d50-8fe8-5f4492fe3767\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.807628 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ljrxl\" (UniqueName: \"kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl\") pod \"keystonef3ef-account-delete-ppgzd\" (UID: \"a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e\") " pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.808065 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.812101 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.812735 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.813089 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.813413 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6pr\" (UniqueName: \"kubernetes.io/projected/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-kube-api-access-2k6pr\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.814041 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e68085f-2b80-48bd-a241-70e75780e41e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.814460 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wg8s\" (UniqueName: \"kubernetes.io/projected/20859507-5b44-49b8-96f0-501d89f511c9-kube-api-access-8wg8s\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.814794 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.815160 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.815505 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffvrh\" (UniqueName: \"kubernetes.io/projected/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-kube-api-access-ffvrh\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.816042 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d656\" (UniqueName: \"kubernetes.io/projected/3e68085f-2b80-48bd-a241-70e75780e41e-kube-api-access-7d656\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.816370 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.805822 4810 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs" (OuterVolumeSpecName: "logs") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.806291 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs" (OuterVolumeSpecName: "logs") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.838412 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.839137 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.845060 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.845732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6" (OuterVolumeSpecName: "kube-api-access-dlpv6") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). InnerVolumeSpecName "kube-api-access-dlpv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.848678 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystonef3ef-account-delete-ppgzd"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.849060 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8" (OuterVolumeSpecName: "kube-api-access-5xmc8") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "kube-api-access-5xmc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.849150 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data" (OuterVolumeSpecName: "config-data") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.850862 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.851111 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.858350 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementc289-account-delete-v9bc4" event={"ID":"ebbf1a25-79f6-42f3-af72-ff0e33094ecd","Type":"ContainerDied","Data":"bc190f56838f6c3c55e6bb6b56769d1b19120c7d476eb7b9d11ea647daed1e3f"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.858399 4810 scope.go:117] "RemoveContainer" containerID="afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.858519 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementc289-account-delete-v9bc4" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.862643 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mnstl"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.863047 4810 generic.go:334] "Generic (PLEG): container finished" podID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" containerID="9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9" exitCode=0 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.863093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi2bc5-account-delete-8p8mv" event={"ID":"5c723fc6-aae6-4032-afba-622dd5ea4dbc","Type":"ContainerDied","Data":"9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.871601 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mnstl"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.872753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-db7b99d8d-9txpw" event={"ID":"2cde0e0d-aa4d-45e1-bac8-52225a6c997c","Type":"ContainerDied","Data":"bab0144eacbf846c30a39d5010d0dd6c1fc6438f1979f0023937f3978740812f"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.872860 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-db7b99d8d-9txpw" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.876640 4810 generic.go:334] "Generic (PLEG): container finished" podID="eb6eb4ae-ef62-480d-80a9-411be811155b" containerID="2453516ab32d8b1790716264b607d141d5a90ef1a6cefa4be573a9c5999f1aee" exitCode=2 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.876818 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb6eb4ae-ef62-480d-80a9-411be811155b","Type":"ContainerDied","Data":"2453516ab32d8b1790716264b607d141d5a90ef1a6cefa4be573a9c5999f1aee"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.876853 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.878450 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron073c-account-delete-9hj8g" event={"ID":"b493c984-b8a7-4a78-9210-fc44ecf70eec","Type":"ContainerDied","Data":"2d9b2d7c30b0760c100de9cf8f24641b9104eec90f95585773f024e21b1479e7"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.878514 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron073c-account-delete-9hj8g" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.880838 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3b7c316c-a9f6-4fa2-b150-83d6b65a9d25","Type":"ContainerDied","Data":"8d373eced8c707de1a18288fed92bfc19b35ff8b730a2b74971d53b68a3d16be"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.880909 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.881877 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-75dfb4687f-c6pvn"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.899355 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.915955 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerID="2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349" exitCode=0 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.915984 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerID="da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba" exitCode=2 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.916069 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerDied","Data":"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.916096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerDied","Data":"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.921783 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a78c823-e408-41d7-9f0f-3a89b648dae7","Type":"ContainerDied","Data":"81a32e631a4f071d87a3476dbadbd1e71bead6de90d3816abfce65d4a3429596"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.921923 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.927665 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonef3ef-account-delete-ppgzd"] Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.929455 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ljrxl], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystonef3ef-account-delete-ppgzd" podUID="a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.933561 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f3ef-account-create-mwtr7"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936191 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8r4x\" (UniqueName: \"kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x\") pod \"b493c984-b8a7-4a78-9210-fc44ecf70eec\" (UID: \"b493c984-b8a7-4a78-9210-fc44ecf70eec\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936300 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936397 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936415 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936442 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936495 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936512 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936589 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hxl6\" (UniqueName: \"kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6\") pod \"ebbf1a25-79f6-42f3-af72-ff0e33094ecd\" (UID: \"ebbf1a25-79f6-42f3-af72-ff0e33094ecd\") " Oct 03 07:20:09 crc 
kubenswrapper[4810]: I1003 07:20:09.936636 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.936671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64m2h\" (UniqueName: \"kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h\") pod \"6a78c823-e408-41d7-9f0f-3a89b648dae7\" (UID: \"6a78c823-e408-41d7-9f0f-3a89b648dae7\") " Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.939531 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs" (OuterVolumeSpecName: "logs") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.942650 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q" (OuterVolumeSpecName: "kube-api-access-rgs2q") pod "28fa9089-99f4-4d50-8fe8-5f4492fe3767" (UID: "28fa9089-99f4-4d50-8fe8-5f4492fe3767"). InnerVolumeSpecName "kube-api-access-rgs2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.943241 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.951650 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x" (OuterVolumeSpecName: "kube-api-access-g8r4x") pod "b493c984-b8a7-4a78-9210-fc44ecf70eec" (UID: "b493c984-b8a7-4a78-9210-fc44ecf70eec"). InnerVolumeSpecName "kube-api-access-g8r4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.954694 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f3ef-account-create-mwtr7"] Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.962322 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrxl\" (UniqueName: \"kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl\") pod \"keystonef3ef-account-delete-ppgzd\" (UID: \"a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e\") " pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.963533 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerDied","Data":"49fef7a7d668fc557a8e17d62bce3a1fd7aea690b4d9b4a867e5da5faeec8731"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.963692 4810 generic.go:334] "Generic (PLEG): container finished" podID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerID="49fef7a7d668fc557a8e17d62bce3a1fd7aea690b4d9b4a867e5da5faeec8731" exitCode=0 Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.967815 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data" (OuterVolumeSpecName: "config-data") pod "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" (UID: "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975307 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975336 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8r4x\" (UniqueName: \"kubernetes.io/projected/b493c984-b8a7-4a78-9210-fc44ecf70eec-kube-api-access-g8r4x\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975363 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975372 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgs2q\" (UniqueName: \"kubernetes.io/projected/28fa9089-99f4-4d50-8fe8-5f4492fe3767-kube-api-access-rgs2q\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975387 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975396 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975405 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmc8\" (UniqueName: \"kubernetes.io/projected/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-kube-api-access-5xmc8\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc 
kubenswrapper[4810]: I1003 07:20:09.975415 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975428 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlpv6\" (UniqueName: \"kubernetes.io/projected/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-kube-api-access-dlpv6\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975437 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975445 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975455 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a78c823-e408-41d7-9f0f-3a89b648dae7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.975755 4810 scope.go:117] "RemoveContainer" containerID="150fc8415216347ff4c833f2ebf152af21b61c3aaf0e38388292d296069c03af" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976081 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder8097-account-delete-nn9p8" event={"ID":"28fa9089-99f4-4d50-8fe8-5f4492fe3767","Type":"ContainerDied","Data":"db6e0d4f23e047950821868b65b55faa7b46962691cf8c1ad44e08922a8fbcf1"} Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976364 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976407 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder8097-account-delete-nn9p8" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.976425 4810 projected.go:194] Error preparing data for projected volume kube-api-access-ljrxl for pod openstack/keystonef3ef-account-delete-ppgzd: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976445 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6887c7bd68-bxt6s" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976526 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-766c6867bf-vjpbj" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976567 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.976489 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-75d48cc7d6-2knhz" Oct 03 07:20:09 crc kubenswrapper[4810]: E1003 07:20:09.976682 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl podName:a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e nodeName:}" failed. No retries permitted until 2025-10-03 07:20:10.476469456 +0000 UTC m=+1443.903720191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ljrxl" (UniqueName: "kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl") pod "keystonef3ef-account-delete-ppgzd" (UID: "a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.980467 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6" (OuterVolumeSpecName: "kube-api-access-6hxl6") pod "ebbf1a25-79f6-42f3-af72-ff0e33094ecd" (UID: "ebbf1a25-79f6-42f3-af72-ff0e33094ecd"). InnerVolumeSpecName "kube-api-access-6hxl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.980832 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0121-account-delete-lrqrh" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.991699 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts" (OuterVolumeSpecName: "scripts") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:09 crc kubenswrapper[4810]: I1003 07:20:09.992765 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h" (OuterVolumeSpecName: "kube-api-access-64m2h") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "kube-api-access-64m2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.015197 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.078420 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hxl6\" (UniqueName: \"kubernetes.io/projected/ebbf1a25-79f6-42f3-af72-ff0e33094ecd-kube-api-access-6hxl6\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.078445 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64m2h\" (UniqueName: \"kubernetes.io/projected/6a78c823-e408-41d7-9f0f-3a89b648dae7-kube-api-access-64m2h\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.078487 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.079964 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.082747 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data" (OuterVolumeSpecName: "config-data") pod "1d279481-fd9c-4c0f-b7d6-ce457c1a697e" (UID: "1d279481-fd9c-4c0f-b7d6-ce457c1a697e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.107459 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.107619 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.166991 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.171843 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.185748 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.185882 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerName="nova-scheduler-scheduler" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.185966 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.189397 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.190158 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6d89\" (UniqueName: \"kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89\") pod \"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa\" (UID: \"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.190358 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9jbk\" (UniqueName: \"kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.190443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.191592 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d279481-fd9c-4c0f-b7d6-ce457c1a697e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.192192 
4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs" (OuterVolumeSpecName: "logs") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.198777 4810 scope.go:117] "RemoveContainer" containerID="ee5aefdafa078a9fbd3fd95199dc8e09c3b1a213b866d27681b0e68a66781605" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.201335 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89" (OuterVolumeSpecName: "kube-api-access-x6d89") pod "1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" (UID: "1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa"). InnerVolumeSpecName "kube-api-access-x6d89". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.203012 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.203139 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts" (OuterVolumeSpecName: "scripts") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.221298 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk" (OuterVolumeSpecName: "kube-api-access-w9jbk") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "kube-api-access-w9jbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.228534 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.262394 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.204:3000/\": dial tcp 10.217.0.204:3000: connect: connection refused" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.289350 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="galera" containerID="cri-o://7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9" gracePeriod=30 Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdwh\" (UniqueName: \"kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh\") pod \"5c723fc6-aae6-4032-afba-622dd5ea4dbc\" (UID: \"5c723fc6-aae6-4032-afba-622dd5ea4dbc\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292310 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292332 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292397 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292419 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\" (UID: \"a43a63ef-cea9-48b2-af35-04fe0a365fa5\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292669 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292680 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6d89\" (UniqueName: \"kubernetes.io/projected/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa-kube-api-access-x6d89\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 
crc kubenswrapper[4810]: I1003 07:20:10.292691 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9jbk\" (UniqueName: \"kubernetes.io/projected/a43a63ef-cea9-48b2-af35-04fe0a365fa5-kube-api-access-w9jbk\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292699 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.292709 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.296120 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.297469 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.301784 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder8097-account-delete-nn9p8"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.306742 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.308377 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.319963 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican0121-account-delete-lrqrh"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.336756 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh" (OuterVolumeSpecName: "kube-api-access-5rdwh") pod "5c723fc6-aae6-4032-afba-622dd5ea4dbc" (UID: "5c723fc6-aae6-4032-afba-622dd5ea4dbc"). InnerVolumeSpecName "kube-api-access-5rdwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.354169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.368825 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.376149 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0b20ced-eb2f-41a3-8988-611615af5759" (UID: "e0b20ced-eb2f-41a3-8988-611615af5759"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.376205 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementc289-account-delete-v9bc4"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399429 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rdwh\" (UniqueName: \"kubernetes.io/projected/5c723fc6-aae6-4032-afba-622dd5ea4dbc-kube-api-access-5rdwh\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399478 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a43a63ef-cea9-48b2-af35-04fe0a365fa5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399500 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399510 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399519 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.399530 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b20ced-eb2f-41a3-8988-611615af5759-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.419987 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 07:20:10 crc kubenswrapper[4810]: W1003 07:20:10.430105 4810 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c723fc6_aae6_4032_afba_622dd5ea4dbc.slice/crio-9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c723fc6_aae6_4032_afba_622dd5ea4dbc.slice/crio-9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9.scope: no such file or directory Oct 03 07:20:10 crc kubenswrapper[4810]: W1003 07:20:10.437540 4810 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebbf1a25_79f6_42f3_af72_ff0e33094ecd.slice/crio-afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a.scope WatchSource:0}: Error finding container afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a: Status 404 returned error can't find the container with id afaa16c9fa685a6a368f75c3e67dbd569b00ebdee1a4f81f32be5e8d85f47b7a Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.454293 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.455819 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data" (OuterVolumeSpecName: "config-data") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.490300 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" (UID: "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.501046 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrxl\" (UniqueName: \"kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl\") pod \"keystonef3ef-account-delete-ppgzd\" (UID: \"a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e\") " pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.501141 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.501157 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.501166 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.501175 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.502213 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: 
"68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.502915 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data" (OuterVolumeSpecName: "config-data") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.504729 4810 projected.go:194] Error preparing data for projected volume kube-api-access-ljrxl for pod openstack/keystonef3ef-account-delete-ppgzd: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.504809 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl podName:a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e nodeName:}" failed. No retries permitted until 2025-10-03 07:20:11.504786091 +0000 UTC m=+1444.932036906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljrxl" (UniqueName: "kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl") pod "keystonef3ef-account-delete-ppgzd" (UID: "a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.517667 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.559763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" (UID: "3c746de5-0009-4522-8e8b-f7d7e0ca5fe7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.559759 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.562297 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.589351 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data" (OuterVolumeSpecName: "config-data") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.593394 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602422 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602455 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602469 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602481 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602494 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602505 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602517 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.602528 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.625775 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.626107 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs" (OuterVolumeSpecName: 
"internal-tls-certs") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.642454 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68aa80a4-2dc5-4bf9-a950-6a612c2e182b" (UID: "68aa80a4-2dc5-4bf9-a950-6a612c2e182b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.651279 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data" (OuterVolumeSpecName: "config-data") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.675876 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data" (OuterVolumeSpecName: "config-data") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.706222 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.706635 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.706694 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68aa80a4-2dc5-4bf9-a950-6a612c2e182b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.706819 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.710431 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.707364 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.713313 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data podName:94a52a94-56b0-4dc9-9804-020a890b1fff nodeName:}" failed. No retries permitted until 2025-10-03 07:20:18.713286444 +0000 UTC m=+1452.140537179 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data") pod "rabbitmq-server-0" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff") : configmap "rabbitmq-config-data" not found Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.709108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.713353 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.720608 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron073c-account-delete-9hj8g"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.726784 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.733188 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.736521 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data" (OuterVolumeSpecName: "config-data") pod "7a72f53c-a634-4457-9cab-3a4cbea90eae" (UID: "7a72f53c-a634-4457-9cab-3a4cbea90eae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.744760 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.749108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.751012 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.753432 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data" (OuterVolumeSpecName: "config-data") pod "6a78c823-e408-41d7-9f0f-3a89b648dae7" (UID: "6a78c823-e408-41d7-9f0f-3a89b648dae7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.756783 4810 scope.go:117] "RemoveContainer" containerID="c2ce06d888aab36e7467034584a2c26ca4a4529f3aa58301133aa498bf2388a0" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.783261 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" (UID: "3b7c316c-a9f6-4fa2-b150-83d6b65a9d25"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.784974 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.789805 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2cde0e0d-aa4d-45e1-bac8-52225a6c997c" (UID: "2cde0e0d-aa4d-45e1-bac8-52225a6c997c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.794027 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-766c6867bf-vjpbj"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.799039 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.802523 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a43a63ef-cea9-48b2-af35-04fe0a365fa5" (UID: "a43a63ef-cea9-48b2-af35-04fe0a365fa5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.804260 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6887c7bd68-bxt6s"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.810356 4810 scope.go:117] "RemoveContainer" containerID="04a5b24413e8a224bab7a8f72e4facc8f0a0f5ecaf88b24ac5a41f3d8c47c173" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818214 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config\") pod \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818278 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdxqr\" (UniqueName: \"kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr\") pod \"eb6eb4ae-ef62-480d-80a9-411be811155b\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config\") pod \"eb6eb4ae-ef62-480d-80a9-411be811155b\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818361 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs\") pod \"eb6eb4ae-ef62-480d-80a9-411be811155b\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818385 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle\") pod \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818424 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs\") pod \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818464 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvp6v\" (UniqueName: \"kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v\") pod \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818568 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data\") pod \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\" (UID: \"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.818608 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle\") pod \"eb6eb4ae-ef62-480d-80a9-411be811155b\" (UID: \"eb6eb4ae-ef62-480d-80a9-411be811155b\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819457 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a72f53c-a634-4457-9cab-3a4cbea90eae-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819476 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819486 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819494 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cde0e0d-aa4d-45e1-bac8-52225a6c997c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819503 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819512 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a78c823-e408-41d7-9f0f-3a89b648dae7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819521 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43a63ef-cea9-48b2-af35-04fe0a365fa5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.819854 4810 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.820935 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" (UID: "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.828528 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr" (OuterVolumeSpecName: "kube-api-access-bdxqr") pod "eb6eb4ae-ef62-480d-80a9-411be811155b" (UID: "eb6eb4ae-ef62-480d-80a9-411be811155b"). InnerVolumeSpecName "kube-api-access-bdxqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.829022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data" (OuterVolumeSpecName: "config-data") pod "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" (UID: "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.833751 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.835591 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v" (OuterVolumeSpecName: "kube-api-access-gvp6v") pod "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" (UID: "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1"). InnerVolumeSpecName "kube-api-access-gvp6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.845107 4810 scope.go:117] "RemoveContainer" containerID="d2effcc3ba0d8b3fd037a8a2e4fb2fa5cfd1c2d632b8be6e535aa804e226e280" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.857964 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" (UID: "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.858859 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb6eb4ae-ef62-480d-80a9-411be811155b" (UID: "eb6eb4ae-ef62-480d-80a9-411be811155b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.868331 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e68085f-2b80-48bd-a241-70e75780e41e" (UID: "3e68085f-2b80-48bd-a241-70e75780e41e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.871015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" (UID: "6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.876161 4810 scope.go:117] "RemoveContainer" containerID="ba367086c2d5f7cae40c427ef1243c3b1029c3f76d37b715b51d6a5c5d14c6c7" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.898147 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "eb6eb4ae-ef62-480d-80a9-411be811155b" (UID: "eb6eb4ae-ef62-480d-80a9-411be811155b"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.920744 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data\") pod \"42b07401-8647-4f71-a3a8-ef7e2e72e171\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.921219 4810 scope.go:117] "RemoveContainer" containerID="61b1ebe508d4c4ad6781c304ef58a02e24a10dd6ba9c40650db17b0f431011ec" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.921712 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle\") pod \"42b07401-8647-4f71-a3a8-ef7e2e72e171\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.921762 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldm4l\" (UniqueName: \"kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l\") pod \"42b07401-8647-4f71-a3a8-ef7e2e72e171\" (UID: \"42b07401-8647-4f71-a3a8-ef7e2e72e171\") " Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923079 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923505 4810 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923684 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvp6v\" 
(UniqueName: \"kubernetes.io/projected/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kube-api-access-gvp6v\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923752 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923820 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.923972 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.924058 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.924117 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdxqr\" (UniqueName: \"kubernetes.io/projected/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-api-access-bdxqr\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.924177 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e68085f-2b80-48bd-a241-70e75780e41e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.924241 4810 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.933254 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "eb6eb4ae-ef62-480d-80a9-411be811155b" (UID: "eb6eb4ae-ef62-480d-80a9-411be811155b"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.941018 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.943545 4810 scope.go:117] "RemoveContainer" containerID="406b0014a3e09b53049dab7e73d10bcd487a07a0390eb2d728848d7df1f67cc2" Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.967574 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.979746 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.981534 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:10 crc kubenswrapper[4810]: I1003 07:20:10.984559 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.993960 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.995546 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 07:20:10 crc kubenswrapper[4810]: E1003 07:20:10.995637 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" containerName="nova-cell0-conductor-conductor" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.002339 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data" (OuterVolumeSpecName: "config-data") pod "42b07401-8647-4f71-a3a8-ef7e2e72e171" (UID: "42b07401-8647-4f71-a3a8-ef7e2e72e171"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.010356 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.010588 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eb6eb4ae-ef62-480d-80a9-411be811155b","Type":"ContainerDied","Data":"46d3634a1200e9e509db8afd33ab98d129832c208ae10bc6f67192a5b9adc6e0"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.014498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l" (OuterVolumeSpecName: "kube-api-access-ldm4l") pod "42b07401-8647-4f71-a3a8-ef7e2e72e171" (UID: "42b07401-8647-4f71-a3a8-ef7e2e72e171"). InnerVolumeSpecName "kube-api-access-ldm4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.022487 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.025792 4810 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb6eb4ae-ef62-480d-80a9-411be811155b-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.025825 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.025841 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldm4l\" (UniqueName: \"kubernetes.io/projected/42b07401-8647-4f71-a3a8-ef7e2e72e171-kube-api-access-ldm4l\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.032950 4810 scope.go:117] "RemoveContainer" containerID="2453516ab32d8b1790716264b607d141d5a90ef1a6cefa4be573a9c5999f1aee" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.033281 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42b07401-8647-4f71-a3a8-ef7e2e72e171" (UID: "42b07401-8647-4f71-a3a8-ef7e2e72e171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.034304 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell03325-account-delete-sqg6n" event={"ID":"1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa","Type":"ContainerDied","Data":"d9acb09122787a0f17af4564d93b3d0b7df17919e76fea1f1bc9b0f1e57ba686"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.034394 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell03325-account-delete-sqg6n" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.035538 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-75d48cc7d6-2knhz"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.042730 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.044520 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi2bc5-account-delete-8p8mv" event={"ID":"5c723fc6-aae6-4032-afba-622dd5ea4dbc","Type":"ContainerDied","Data":"b8fd1274c7bb002b7eed16cc22bdd6be3e2781e36e837889d4aa6576954dd4ab"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.044670 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi2bc5-account-delete-8p8mv" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.047283 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.064752 4810 scope.go:117] "RemoveContainer" containerID="b1217ef0c832fd42381708528dc2544dfbb04164aeff3f282d577c64071f2c38" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.071224 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.081215 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.083495 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a43a63ef-cea9-48b2-af35-04fe0a365fa5","Type":"ContainerDied","Data":"d56da9519fd119b5258df6c886b3770dc7f003fbe71251b3bdf7a752ed0c8f5d"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.083637 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.102724 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.107979 4810 generic.go:334] "Generic (PLEG): container finished" podID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" exitCode=0 Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.108061 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42b07401-8647-4f71-a3a8-ef7e2e72e171","Type":"ContainerDied","Data":"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.108090 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"42b07401-8647-4f71-a3a8-ef7e2e72e171","Type":"ContainerDied","Data":"04192173094e28b8ce7c54a6f854f1c13abca1be6c22df10deaeb19c99482b5e"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.108123 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.109177 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell03325-account-delete-sqg6n"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.113564 4810 generic.go:334] "Generic (PLEG): container finished" podID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" containerID="68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e" exitCode=0 Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.113628 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1","Type":"ContainerDied","Data":"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.113682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1","Type":"ContainerDied","Data":"a068a4fc698e7be87ad1d45230c27fbc1f04c95a5ddf4a3a59fa3f2e51df7381"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.113750 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.126328 4810 scope.go:117] "RemoveContainer" containerID="9eb958e7ea55c896923679489dfe8330d61824750b953162f4316b59315e22d9" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.127291 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42b07401-8647-4f71-a3a8-ef7e2e72e171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.134927 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.136437 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerID="23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb" exitCode=0 Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.136566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerDied","Data":"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb"} Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.143214 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi2bc5-account-delete-8p8mv"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.147375 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.147579 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.167276 4810 scope.go:117] "RemoveContainer" containerID="49fef7a7d668fc557a8e17d62bce3a1fd7aea690b4d9b4a867e5da5faeec8731" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.167670 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.169150 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.178818 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.184027 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.192130 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.197124 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.197487 4810 scope.go:117] "RemoveContainer" containerID="09c620dc8fe68cac670d9ae669f099323e4cea23f3ddbfa4a9ad6d121c674e43" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.201784 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.211997 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-db7b99d8d-9txpw"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.225918 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.226009 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.237434 4810 scope.go:117] "RemoveContainer" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.312458 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8d0a7a-6384-400f-91b8-3bf1cfb785a8" path="/var/lib/kubelet/pods/1b8d0a7a-6384-400f-91b8-3bf1cfb785a8/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.313140 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d279481-fd9c-4c0f-b7d6-ce457c1a697e" path="/var/lib/kubelet/pods/1d279481-fd9c-4c0f-b7d6-ce457c1a697e/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.313632 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" path="/var/lib/kubelet/pods/1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.314516 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20859507-5b44-49b8-96f0-501d89f511c9" path="/var/lib/kubelet/pods/20859507-5b44-49b8-96f0-501d89f511c9/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.316690 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fa9089-99f4-4d50-8fe8-5f4492fe3767" path="/var/lib/kubelet/pods/28fa9089-99f4-4d50-8fe8-5f4492fe3767/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.317241 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" path="/var/lib/kubelet/pods/2cde0e0d-aa4d-45e1-bac8-52225a6c997c/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.317798 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" path="/var/lib/kubelet/pods/3b7c316c-a9f6-4fa2-b150-83d6b65a9d25/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.318959 4810 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="3c746de5-0009-4522-8e8b-f7d7e0ca5fe7" path="/var/lib/kubelet/pods/3c746de5-0009-4522-8e8b-f7d7e0ca5fe7/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.319504 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e68085f-2b80-48bd-a241-70e75780e41e" path="/var/lib/kubelet/pods/3e68085f-2b80-48bd-a241-70e75780e41e/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.320931 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" path="/var/lib/kubelet/pods/42b07401-8647-4f71-a3a8-ef7e2e72e171/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.321439 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46bba23b-757b-4303-a92c-78cb8a956c35" path="/var/lib/kubelet/pods/46bba23b-757b-4303-a92c-78cb8a956c35/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.321879 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48838182-7963-4d2a-a108-13e9a5c7a111" path="/var/lib/kubelet/pods/48838182-7963-4d2a-a108-13e9a5c7a111/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.322742 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" path="/var/lib/kubelet/pods/5c723fc6-aae6-4032-afba-622dd5ea4dbc/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.323310 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d666cf5-d522-45c6-8fbf-13da1289394a" path="/var/lib/kubelet/pods/5d666cf5-d522-45c6-8fbf-13da1289394a/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.324507 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68aa80a4-2dc5-4bf9-a950-6a612c2e182b" path="/var/lib/kubelet/pods/68aa80a4-2dc5-4bf9-a950-6a612c2e182b/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.325399 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" path="/var/lib/kubelet/pods/6a78c823-e408-41d7-9f0f-3a89b648dae7/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.326293 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" path="/var/lib/kubelet/pods/6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.327322 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a72f53c-a634-4457-9cab-3a4cbea90eae" path="/var/lib/kubelet/pods/7a72f53c-a634-4457-9cab-3a4cbea90eae/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.328236 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef" path="/var/lib/kubelet/pods/7a8e2a63-e7f0-4fc0-a6ec-c5151f5474ef/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.328848 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6" path="/var/lib/kubelet/pods/9ffe0ed4-868e-4d44-9d25-7324c0d8b3a6/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.330047 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" path="/var/lib/kubelet/pods/a43a63ef-cea9-48b2-af35-04fe0a365fa5/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.330563 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b493c984-b8a7-4a78-9210-fc44ecf70eec" path="/var/lib/kubelet/pods/b493c984-b8a7-4a78-9210-fc44ecf70eec/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.331313 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1f72b6-eab9-4f83-b5ed-5507c2e79a46" path="/var/lib/kubelet/pods/dd1f72b6-eab9-4f83-b5ed-5507c2e79a46/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.332548 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2b9513-b087-4498-aeef-d912e22091fb" path="/var/lib/kubelet/pods/de2b9513-b087-4498-aeef-d912e22091fb/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.333179 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b20ced-eb2f-41a3-8988-611615af5759" path="/var/lib/kubelet/pods/e0b20ced-eb2f-41a3-8988-611615af5759/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.333875 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16b4d1d-9bc5-4634-a03e-fd823db95e0d" path="/var/lib/kubelet/pods/e16b4d1d-9bc5-4634-a03e-fd823db95e0d/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.335565 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85501ad-bec4-4381-a692-07aa40c0fc57" path="/var/lib/kubelet/pods/e85501ad-bec4-4381-a692-07aa40c0fc57/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.336042 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6eb4ae-ef62-480d-80a9-411be811155b" path="/var/lib/kubelet/pods/eb6eb4ae-ef62-480d-80a9-411be811155b/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.336434 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebbf1a25-79f6-42f3-af72-ff0e33094ecd" path="/var/lib/kubelet/pods/ebbf1a25-79f6-42f3-af72-ff0e33094ecd/volumes" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.340999 4810 scope.go:117] "RemoveContainer" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.341987 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530\": container with ID starting with 8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530 not found: ID does not exist" containerID="8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.342016 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530"} err="failed to get container status \"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530\": rpc error: code = NotFound desc = could not find container \"8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530\": container with ID starting with 8b6b60eb11e6d7c170a13a02d3baf25934e0ae833f748faf4d004a2b682bf530 not found: ID does not exist" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.342039 4810 scope.go:117] "RemoveContainer" containerID="68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.391024 4810 scope.go:117] "RemoveContainer" containerID="68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e" Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.391404 4810 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e\": container with ID starting with 68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e not found: ID does not exist" containerID="68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.391429 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e"} err="failed to get container status \"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e\": rpc error: code = NotFound desc = could not find container \"68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e\": container with ID starting with 68ad82092fa3e80c078309075bc2693d1a2935739f72774338d23b91ce8f5c7e not found: ID does not exist" Oct 03 07:20:11 crc kubenswrapper[4810]: I1003 07:20:11.542604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljrxl\" (UniqueName: \"kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl\") pod \"keystonef3ef-account-delete-ppgzd\" (UID: \"a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e\") " pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.546913 4810 projected.go:194] Error preparing data for projected volume kube-api-access-ljrxl for pod openstack/keystonef3ef-account-delete-ppgzd: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.546984 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl podName:a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e nodeName:}" failed. No retries permitted until 2025-10-03 07:20:13.546963843 +0000 UTC m=+1446.974214578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ljrxl" (UniqueName: "kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl") pod "keystonef3ef-account-delete-ppgzd" (UID: "a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.746093 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:11 crc kubenswrapper[4810]: E1003 07:20:11.746482 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data podName:37cd32da-b730-4e57-a2e0-41bf95ff8ca1 nodeName:}" failed. No retries permitted until 2025-10-03 07:20:19.746467344 +0000 UTC m=+1453.173718069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data") pod "rabbitmq-cell1-server-0" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1") : configmap "rabbitmq-cell1-config-data" not found Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.186871 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc7b4477-ff24-462e-8d3e-99794719fd37/ovn-northd/0.log" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.186952 4810 generic.go:334] "Generic (PLEG): container finished" podID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" exitCode=139 Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.187032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerDied","Data":"ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7"} Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.206504 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystonef3ef-account-delete-ppgzd" Oct 03 07:20:12 crc kubenswrapper[4810]: E1003 07:20:12.369315 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a52a94_56b0_4dc9_9804_020a890b1fff.slice/crio-35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37cd32da_b730_4e57_a2e0_41bf95ff8ca1.slice/crio-conmon-7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37cd32da_b730_4e57_a2e0_41bf95ff8ca1.slice/crio-7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.419732 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystonef3ef-account-delete-ppgzd"] Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.426826 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystonef3ef-account-delete-ppgzd"] Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.461876 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljrxl\" (UniqueName: \"kubernetes.io/projected/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e-kube-api-access-ljrxl\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.613424 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664547 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664615 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664683 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664789 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664824 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664846 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664884 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664930 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.664986 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: 
\"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.665008 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hj27\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27\") pod \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\" (UID: \"37cd32da-b730-4e57-a2e0-41bf95ff8ca1\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.665086 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.665236 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.665381 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.665400 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.666358 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.674482 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27" (OuterVolumeSpecName: "kube-api-access-9hj27") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "kube-api-access-9hj27". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.675127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.675312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.675544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info" (OuterVolumeSpecName: "pod-info") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.676028 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.692415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data" (OuterVolumeSpecName: "config-data") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.718229 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf" (OuterVolumeSpecName: "server-conf") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767534 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767586 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767604 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767625 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767644 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767660 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hj27\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-kube-api-access-9hj27\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767704 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.767723 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.781403 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "37cd32da-b730-4e57-a2e0-41bf95ff8ca1" (UID: "37cd32da-b730-4e57-a2e0-41bf95ff8ca1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.785089 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc7b4477-ff24-462e-8d3e-99794719fd37/ovn-northd/0.log" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.785163 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.813452 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.823374 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868304 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868344 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868425 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868448 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868466 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868509 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868590 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868622 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: 
\"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868668 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868684 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868753 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868794 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868816 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868947 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.868987 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw5cw\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw\") pod \"94a52a94-56b0-4dc9-9804-020a890b1fff\" (UID: \"94a52a94-56b0-4dc9-9804-020a890b1fff\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869061 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86j2q\" (UniqueName: \"kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q\") pod \"fc7b4477-ff24-462e-8d3e-99794719fd37\" (UID: \"fc7b4477-ff24-462e-8d3e-99794719fd37\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869077 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir" (OuterVolumeSpecName: 
"ovn-rundir") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869598 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869626 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869640 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.869654 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/37cd32da-b730-4e57-a2e0-41bf95ff8ca1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.871908 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.872350 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.872693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config" (OuterVolumeSpecName: "config") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.874800 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q" (OuterVolumeSpecName: "kube-api-access-86j2q") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "kube-api-access-86j2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.875122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.875314 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info" (OuterVolumeSpecName: "pod-info") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.875652 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.876122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts" (OuterVolumeSpecName: "scripts") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.882380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.886099 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.890023 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw" (OuterVolumeSpecName: "kube-api-access-qw5cw") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "kube-api-access-qw5cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.896534 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data" (OuterVolumeSpecName: "config-data") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.901888 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.916374 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf" (OuterVolumeSpecName: "server-conf") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.941784 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.945832 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "fc7b4477-ff24-462e-8d3e-99794719fd37" (UID: "fc7b4477-ff24-462e-8d3e-99794719fd37"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970611 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970732 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5wd\" (UniqueName: \"kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970766 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970799 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970850 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 
07:20:12.970927 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.970988 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971048 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts\") pod \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\" (UID: \"064ce5a9-4b27-4851-93ea-aa8c3f038eb8\") " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971385 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971405 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc7b4477-ff24-462e-8d3e-99794719fd37-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971417 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/94a52a94-56b0-4dc9-9804-020a890b1fff-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971438 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971474 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971498 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971514 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw5cw\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-kube-api-access-qw5cw\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971527 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86j2q\" (UniqueName: \"kubernetes.io/projected/fc7b4477-ff24-462e-8d3e-99794719fd37-kube-api-access-86j2q\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971538 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971548 4810 reconciler_common.go:293] "Volume 
detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971559 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc7b4477-ff24-462e-8d3e-99794719fd37-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971571 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971581 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/94a52a94-56b0-4dc9-9804-020a890b1fff-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971590 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971606 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/94a52a94-56b0-4dc9-9804-020a890b1fff-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.971613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.973043 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.973146 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.973247 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.974201 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd" (OuterVolumeSpecName: "kube-api-access-bs5wd") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "kube-api-access-bs5wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.976066 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets" (OuterVolumeSpecName: "secrets") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.977749 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "94a52a94-56b0-4dc9-9804-020a890b1fff" (UID: "94a52a94-56b0-4dc9-9804-020a890b1fff"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.987766 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.989949 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 03 07:20:12 crc kubenswrapper[4810]: I1003 07:20:12.998430 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.016678 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "064ce5a9-4b27-4851-93ea-aa8c3f038eb8" (UID: "064ce5a9-4b27-4851-93ea-aa8c3f038eb8"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072826 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072864 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072875 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072886 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072908 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072918 4810 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072926 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072934 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/94a52a94-56b0-4dc9-9804-020a890b1fff-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072944 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5wd\" (UniqueName: \"kubernetes.io/projected/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kube-api-access-bs5wd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072952 4810 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.072960 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/064ce5a9-4b27-4851-93ea-aa8c3f038eb8-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.087301 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.173946 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.216520 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerID="35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f" exitCode=0 Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.216587 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.216603 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerDied","Data":"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.216659 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"94a52a94-56b0-4dc9-9804-020a890b1fff","Type":"ContainerDied","Data":"d341764c4d2c9dfccb600b236d913b4e6d6213ab3a29426f4ad741f70beb4a4c"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.216679 4810 scope.go:117] "RemoveContainer" containerID="35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.228758 4810 generic.go:334] "Generic (PLEG): container finished" podID="a856067b-dba0-4860-93c3-ff4650760a4b" containerID="accc7eedd538c9a716310f485205e12a3004a700295626a4fb4494a1c3cf497b" exitCode=0 Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.228820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9f9f6fd8-z8ls6" event={"ID":"a856067b-dba0-4860-93c3-ff4650760a4b","Type":"ContainerDied","Data":"accc7eedd538c9a716310f485205e12a3004a700295626a4fb4494a1c3cf497b"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.235225 4810 generic.go:334] "Generic (PLEG): container finished" podID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerID="7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec" exitCode=0 Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.235289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerDied","Data":"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.235298 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.235317 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"37cd32da-b730-4e57-a2e0-41bf95ff8ca1","Type":"ContainerDied","Data":"8d08b5b89b59bdbf02c3c7b5d19cb457dcc42de2ffcd1dbfaddd56151c5a6d76"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.262110 4810 generic.go:334] "Generic (PLEG): container finished" podID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerID="7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9" exitCode=0 Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.262231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerDied","Data":"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.262247 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.262261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"064ce5a9-4b27-4851-93ea-aa8c3f038eb8","Type":"ContainerDied","Data":"a94510aec3efc19ce3f067e90ef6e3002b25e5aa00e04d35cecbb4cc79fb849c"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.293155 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_fc7b4477-ff24-462e-8d3e-99794719fd37/ovn-northd/0.log" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.293241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"fc7b4477-ff24-462e-8d3e-99794719fd37","Type":"ContainerDied","Data":"c6e3acd5a0de307723b03b1a4f782dddf205210f8198e24dfb72b544bee75f21"} Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.293330 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.325696 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e" path="/var/lib/kubelet/pods/a4006d6f-3bb2-4bb4-b7e7-2572cd7d481e/volumes" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.590151 4810 scope.go:117] "RemoveContainer" containerID="b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.692317 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.694807 4810 scope.go:117] "RemoveContainer" containerID="35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.699321 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.705412 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f\": container with ID starting with 35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f not found: ID does not exist" containerID="35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.705451 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f"} err="failed to get container status \"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f\": rpc error: code = NotFound desc = could not find container \"35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f\": container with ID starting with 35504826a5ead3a2f1996ef82483c9e6dadc00d4fd9064f6d241694f58a0d30f not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.705469 4810 scope.go:117] "RemoveContainer" containerID="b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.709530 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.709610 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0\": container with ID starting with b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0 not found: ID does not exist" containerID="b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.709632 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0"} err="failed to get container status \"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0\": rpc error: code = NotFound desc = could not find container \"b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0\": container with ID starting with b08242b18f155961e5f311f867c340499f7bfd7b4a7fc7862706122d819a7cf0 not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.709657 4810 scope.go:117] "RemoveContainer" containerID="7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.712039 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.723010 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.736076 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.756068 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-db7b99d8d-9txpw" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": dial tcp 10.217.0.158:9311: i/o timeout" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.756372 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.756478 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-db7b99d8d-9txpw" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.158:9311/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.759761 4810 scope.go:117] "RemoveContainer" containerID="6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.764537 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.773883 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 
crc kubenswrapper[4810]: I1003 07:20:13.795772 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795838 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795871 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795927 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795978 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trkk7\" (UniqueName: \"kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.795997 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.796029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys\") pod \"a856067b-dba0-4860-93c3-ff4650760a4b\" (UID: \"a856067b-dba0-4860-93c3-ff4650760a4b\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.802938 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.803319 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.804681 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts" (OuterVolumeSpecName: "scripts") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.808788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7" (OuterVolumeSpecName: "kube-api-access-trkk7") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "kube-api-access-trkk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.823374 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data" (OuterVolumeSpecName: "config-data") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.853090 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.853127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.855557 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a856067b-dba0-4860-93c3-ff4650760a4b" (UID: "a856067b-dba0-4860-93c3-ff4650760a4b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.885541 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.898980 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899011 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899020 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899032 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899042 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trkk7\" (UniqueName: \"kubernetes.io/projected/a856067b-dba0-4860-93c3-ff4650760a4b-kube-api-access-trkk7\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899051 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899059 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.899067 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a856067b-dba0-4860-93c3-ff4650760a4b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.922174 4810 scope.go:117] "RemoveContainer" containerID="7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec" Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.922695 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec\": container with ID starting with 7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec not found: ID does not exist" containerID="7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.922727 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec"} err="failed to get container status \"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec\": rpc error: code = NotFound desc = could not find container \"7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec\": container with ID starting with 7e859daeea290eb1eb12c2623eae3ac02395270be3a994ebc35204cd828dd8ec not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.922745 4810 scope.go:117] "RemoveContainer" 
containerID="6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3" Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.923093 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3\": container with ID starting with 6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3 not found: ID does not exist" containerID="6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.923116 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3"} err="failed to get container status \"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3\": rpc error: code = NotFound desc = could not find container \"6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3\": container with ID starting with 6e3d96e22316dd1efcae653fa53f8defb78dd078e621618c5109f3ac3c88c7c3 not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.923129 4810 scope.go:117] "RemoveContainer" containerID="7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.951207 4810 scope.go:117] "RemoveContainer" containerID="1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.983906 4810 scope.go:117] "RemoveContainer" containerID="7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9" Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.984402 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9\": container with ID starting with 7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9 not found: ID does not exist" containerID="7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.984440 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9"} err="failed to get container status \"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9\": rpc error: code = NotFound desc = could not find container \"7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9\": container with ID starting with 7bac09cef4eaa399742365740475c7beb36be87c0d13e5753976f5d853e794d9 not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.984465 4810 scope.go:117] "RemoveContainer" containerID="1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7" Oct 03 07:20:13 crc kubenswrapper[4810]: E1003 07:20:13.984739 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7\": container with ID starting with 1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7 not found: ID does not exist" containerID="1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.984822 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7"} err="failed to get container status \"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7\": rpc error: code = NotFound desc = could not find container \"1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7\": container with ID starting with 1c55f8051c7d824e646843e5f5e35de162e0ba0a6349e944eada3808f9d4ecf7 not found: ID does not exist" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.984836 4810 scope.go:117] "RemoveContainer" containerID="1fc2cb239867d8112861fe2339c64ebf43664be1f37eb5d32adf5e8d9a6244a0" Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.999718 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:13 crc kubenswrapper[4810]: I1003 07:20:13.999800 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000178 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000222 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000256 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000293 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scmrn\" (UniqueName: \"kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000314 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000336 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data\") pod \"5a422cc8-26fd-4749-9648-c00d6f5c9009\" (UID: \"5a422cc8-26fd-4749-9648-c00d6f5c9009\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000246 4810 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.000522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.003975 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts" (OuterVolumeSpecName: "scripts") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.004509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn" (OuterVolumeSpecName: "kube-api-access-scmrn") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "kube-api-access-scmrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.007196 4810 scope.go:117] "RemoveContainer" containerID="ddafc1588b05f272d8068696a1b71ca7a115233ca97b20484dd445c7940676f7" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.024947 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.064693 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.077873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.090454 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data" (OuterVolumeSpecName: "config-data") pod "5a422cc8-26fd-4749-9648-c00d6f5c9009" (UID: "5a422cc8-26fd-4749-9648-c00d6f5c9009"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101671 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101698 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scmrn\" (UniqueName: \"kubernetes.io/projected/5a422cc8-26fd-4749-9648-c00d6f5c9009-kube-api-access-scmrn\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101711 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101724 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101737 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101747 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101758 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a422cc8-26fd-4749-9648-c00d6f5c9009-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.101769 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a422cc8-26fd-4749-9648-c00d6f5c9009-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.110069 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.175470 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202159 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wb7\" (UniqueName: \"kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7\") pod \"589491ba-eae0-427c-8124-cacdcabd03c0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle\") pod \"589491ba-eae0-427c-8124-cacdcabd03c0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202230 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data\") pod \"6c0ac24c-ed16-445a-b135-41b696510fc8\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202249 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw295\" (UniqueName: \"kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295\") pod \"6c0ac24c-ed16-445a-b135-41b696510fc8\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202315 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data\") pod \"589491ba-eae0-427c-8124-cacdcabd03c0\" (UID: \"589491ba-eae0-427c-8124-cacdcabd03c0\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.202359 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle\") pod \"6c0ac24c-ed16-445a-b135-41b696510fc8\" (UID: \"6c0ac24c-ed16-445a-b135-41b696510fc8\") " Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.239228 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7" (OuterVolumeSpecName: "kube-api-access-72wb7") pod "589491ba-eae0-427c-8124-cacdcabd03c0" (UID: "589491ba-eae0-427c-8124-cacdcabd03c0"). InnerVolumeSpecName "kube-api-access-72wb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.240552 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295" (OuterVolumeSpecName: "kube-api-access-nw295") pod "6c0ac24c-ed16-445a-b135-41b696510fc8" (UID: "6c0ac24c-ed16-445a-b135-41b696510fc8"). InnerVolumeSpecName "kube-api-access-nw295". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.243201 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data" (OuterVolumeSpecName: "config-data") pod "589491ba-eae0-427c-8124-cacdcabd03c0" (UID: "589491ba-eae0-427c-8124-cacdcabd03c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.243651 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data" (OuterVolumeSpecName: "config-data") pod "6c0ac24c-ed16-445a-b135-41b696510fc8" (UID: "6c0ac24c-ed16-445a-b135-41b696510fc8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.244574 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "589491ba-eae0-427c-8124-cacdcabd03c0" (UID: "589491ba-eae0-427c-8124-cacdcabd03c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.275020 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c0ac24c-ed16-445a-b135-41b696510fc8" (UID: "6c0ac24c-ed16-445a-b135-41b696510fc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303247 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303275 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303285 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wb7\" (UniqueName: \"kubernetes.io/projected/589491ba-eae0-427c-8124-cacdcabd03c0-kube-api-access-72wb7\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303296 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/589491ba-eae0-427c-8124-cacdcabd03c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303305 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0ac24c-ed16-445a-b135-41b696510fc8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.303313 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw295\" (UniqueName: \"kubernetes.io/projected/6c0ac24c-ed16-445a-b135-41b696510fc8-kube-api-access-nw295\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.305322 4810 generic.go:334] "Generic (PLEG): container finished" podID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" exitCode=0 Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.305369 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.305362 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6c0ac24c-ed16-445a-b135-41b696510fc8","Type":"ContainerDied","Data":"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.305482 4810 scope.go:117] "RemoveContainer" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.305412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6c0ac24c-ed16-445a-b135-41b696510fc8","Type":"ContainerDied","Data":"5b4ea0be33fbe46ebdbfeeec75b259e170352ba801146c919041ff92c32572c2"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.309771 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerID="6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f" exitCode=0 Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.309885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerDied","Data":"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.309979 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a422cc8-26fd-4749-9648-c00d6f5c9009","Type":"ContainerDied","Data":"354f6c76572defe4220c18e3144a43c5ea9902962319109cf6c16f3e4ff24a02"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.310091 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.313035 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b9f9f6fd8-z8ls6" event={"ID":"a856067b-dba0-4860-93c3-ff4650760a4b","Type":"ContainerDied","Data":"c2b56e5b7c279a06313e12ec070d16cdcd9658d80d73bb0c08f6f925a91535f6"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.313153 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b9f9f6fd8-z8ls6" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.325227 4810 generic.go:334] "Generic (PLEG): container finished" podID="589491ba-eae0-427c-8124-cacdcabd03c0" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" exitCode=0 Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.325380 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"589491ba-eae0-427c-8124-cacdcabd03c0","Type":"ContainerDied","Data":"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.325464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"589491ba-eae0-427c-8124-cacdcabd03c0","Type":"ContainerDied","Data":"c322c03b1f868e344abca0f607ab2296b22ae489d6731a7d4d90393f735d359c"} Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.325559 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.366075 4810 scope.go:117] "RemoveContainer" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.366521 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924\": container with ID starting with 83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924 not found: ID does not exist" containerID="83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.366603 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924"} err="failed to get container status \"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924\": rpc error: code = NotFound desc = could not find container \"83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924\": container with ID starting with 83304e6cbae0506eb5b184b966d7a2909114a78470650286f0ee4bb804858924 not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.366686 4810 scope.go:117] "RemoveContainer" containerID="2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.376583 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.382170 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5b9f9f6fd8-z8ls6"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.389759 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.395309 4810 scope.go:117] "RemoveContainer" containerID="da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.401230 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.406278 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.411474 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.414912 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.415132 4810 scope.go:117] "RemoveContainer" containerID="6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.418774 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.431392 4810 scope.go:117] "RemoveContainer" containerID="23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.447279 4810 scope.go:117] "RemoveContainer" containerID="2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.447644 4810 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349\": container with ID starting with 2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349 not found: ID does not exist" containerID="2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.447685 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349"} err="failed to get container status \"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349\": rpc error: code = NotFound desc = could not find container \"2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349\": container with ID starting with 2a6220cb6359733c48118f7a2fa3fa5f2563dd79b635f93fa37ccc4a21585349 not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.447712 4810 scope.go:117] "RemoveContainer" containerID="da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.448148 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba\": container with ID starting with da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba not found: ID does not exist" containerID="da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.448169 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba"} err="failed to get container status \"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba\": rpc error: code = NotFound desc = could not find container \"da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba\": container with ID starting with da51d9dba571553a4fb309487e6ba9396b772a1a370eb1c2b0efd1ecd0ace4ba not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.448185 4810 scope.go:117] "RemoveContainer" containerID="6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.448521 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f\": container with ID starting with 6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f not found: ID does not exist" containerID="6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.448636 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f"} err="failed to get container status \"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f\": rpc error: code = NotFound desc = could not find container \"6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f\": container with ID starting with 6ab20cd55a7031cabe2fdc55d65230e3e305cf2223c1b94d641b9fb89488317f not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.448738 4810 scope.go:117] "RemoveContainer" 
containerID="23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.449191 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb\": container with ID starting with 23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb not found: ID does not exist" containerID="23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.449219 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb"} err="failed to get container status \"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb\": rpc error: code = NotFound desc = could not find container \"23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb\": container with ID starting with 23aed09a613d023a9b2a0144eef75a2402cbb61e2714c4c2a68cdd97ea5198bb not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.449234 4810 scope.go:117] "RemoveContainer" containerID="accc7eedd538c9a716310f485205e12a3004a700295626a4fb4494a1c3cf497b" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.474218 4810 scope.go:117] "RemoveContainer" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.493787 4810 scope.go:117] "RemoveContainer" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.494196 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1\": container with ID starting with 58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1 not found: ID does not exist" containerID="58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1" Oct 03 07:20:14 crc kubenswrapper[4810]: I1003 07:20:14.494226 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1"} err="failed to get container status \"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1\": rpc error: code = NotFound desc = could not find container \"58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1\": container with ID starting with 58b269eebaf60bac271459aaa5650588ba71ce066953010279b4b0d1ce393bc1 not found: ID does not exist" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.719760 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.720277 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" 
containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.720522 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.720545 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.722236 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.723639 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.724880 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:14 crc kubenswrapper[4810]: E1003 07:20:14.724957 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.312437 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" path="/var/lib/kubelet/pods/064ce5a9-4b27-4851-93ea-aa8c3f038eb8/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.313201 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" path="/var/lib/kubelet/pods/37cd32da-b730-4e57-a2e0-41bf95ff8ca1/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.314260 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" path="/var/lib/kubelet/pods/589491ba-eae0-427c-8124-cacdcabd03c0/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.314794 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" path="/var/lib/kubelet/pods/5a422cc8-26fd-4749-9648-c00d6f5c9009/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.315499 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" path="/var/lib/kubelet/pods/6c0ac24c-ed16-445a-b135-41b696510fc8/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.316842 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" path="/var/lib/kubelet/pods/94a52a94-56b0-4dc9-9804-020a890b1fff/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.317371 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a856067b-dba0-4860-93c3-ff4650760a4b" path="/var/lib/kubelet/pods/a856067b-dba0-4860-93c3-ff4650760a4b/volumes" Oct 03 07:20:15 crc kubenswrapper[4810]: I1003 07:20:15.317982 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" path="/var/lib/kubelet/pods/fc7b4477-ff24-462e-8d3e-99794719fd37/volumes" Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.721996 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.723707 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.723721 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.724412 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.724663 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.726083 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.727753 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:19 crc kubenswrapper[4810]: E1003 07:20:19.728085 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.285004 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.441139 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerID="2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81" exitCode=0 Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.441213 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerDied","Data":"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81"} Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.441245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-584cb866f5-5rvp9" event={"ID":"5a145896-686f-4bb6-b44b-fac558a2f4fd","Type":"ContainerDied","Data":"7ed755887fa36cdbd6d8edaec5e0f363ffb843387c1fb832fa183893eb1ffdcb"} Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.441265 4810 scope.go:117] "RemoveContainer" containerID="98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.441217 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-584cb866f5-5rvp9" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.461109 4810 scope.go:117] "RemoveContainer" containerID="2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470208 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470296 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pltk\" (UniqueName: \"kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470366 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470419 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470483 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.470643 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config\") pod \"5a145896-686f-4bb6-b44b-fac558a2f4fd\" (UID: \"5a145896-686f-4bb6-b44b-fac558a2f4fd\") " Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.474668 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk" (OuterVolumeSpecName: "kube-api-access-6pltk") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "kube-api-access-6pltk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.474857 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.486999 4810 scope.go:117] "RemoveContainer" containerID="98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a" Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.487571 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a\": container with ID starting with 98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a not found: ID does not exist" containerID="98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.487609 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a"} err="failed to get container status \"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a\": rpc error: code = NotFound desc = could not find container \"98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a\": container with ID starting with 98405850c14f1dab796f1d332217b1a3068016102506bfb5509ecac5f6d30c7a not found: ID does not exist" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.487631 4810 scope.go:117] "RemoveContainer" containerID="2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81" Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.488134 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81\": container with ID starting with 2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81 not found: ID does not exist" containerID="2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.488182 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81"} err="failed to get container status \"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81\": rpc error: code = NotFound desc = could not find container \"2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81\": container with ID starting with 2a12ecdcb725a01c0f523c90b8a5c68fcd71c3a2cbc0c57377ea54f174d00f81 not found: ID does not exist" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.512676 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.517216 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config" (OuterVolumeSpecName: "config") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.518002 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.526153 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.542719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5a145896-686f-4bb6-b44b-fac558a2f4fd" (UID: "5a145896-686f-4bb6-b44b-fac558a2f4fd"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572352 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572398 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572415 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-config\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572428 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pltk\" (UniqueName: \"kubernetes.io/projected/5a145896-686f-4bb6-b44b-fac558a2f4fd-kube-api-access-6pltk\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572441 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572451 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.572461 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a145896-686f-4bb6-b44b-fac558a2f4fd-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.720329 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running 
failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.721053 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.721400 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.721476 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.721668 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.723274 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.724633 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:24 crc kubenswrapper[4810]: E1003 07:20:24.724714 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.781457 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:20:24 crc kubenswrapper[4810]: I1003 07:20:24.789798 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-584cb866f5-5rvp9"] Oct 03 07:20:25 crc kubenswrapper[4810]: I1003 07:20:25.317400 4810 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" path="/var/lib/kubelet/pods/5a145896-686f-4bb6-b44b-fac558a2f4fd/volumes" Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.720098 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.721159 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.721655 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.721709 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.723323 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.725326 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.727522 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 03 07:20:29 crc kubenswrapper[4810]: E1003 07:20:29.727563 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-s8f2s" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:32 crc kubenswrapper[4810]: I1003 07:20:32.088965 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:20:32 crc kubenswrapper[4810]: I1003 07:20:32.090439 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.319030 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s8f2s_5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c/ovs-vswitchd/0.log" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.319933 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.509845 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510127 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510185 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz9gv\" (UniqueName: \"kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510214 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510285 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510316 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run\") pod \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\" (UID: \"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c\") " Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510322 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs" 
(OuterVolumeSpecName: "etc-ovs") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log" (OuterVolumeSpecName: "var-log") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib" (OuterVolumeSpecName: "var-lib") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run" (OuterVolumeSpecName: "var-run") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510746 4810 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-log\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510768 4810 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-lib\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510786 4810 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.510802 4810 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.511534 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts" (OuterVolumeSpecName: "scripts") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.520390 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv" (OuterVolumeSpecName: "kube-api-access-fz9gv") pod "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" (UID: "5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c"). InnerVolumeSpecName "kube-api-access-fz9gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.551380 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s8f2s_5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c/ovs-vswitchd/0.log" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552526 4810 generic.go:334] "Generic (PLEG): container finished" podID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" exitCode=137 Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552570 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerDied","Data":"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c"} Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552597 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s8f2s" event={"ID":"5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c","Type":"ContainerDied","Data":"c91af75b681d9da00496316aa26097786cb96c87179bd1b1811efde6271d4a3f"} Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552614 4810 scope.go:117] "RemoveContainer" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552625 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s8f2s" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.597145 4810 scope.go:117] "RemoveContainer" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.599967 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.611734 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-s8f2s"] Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.612886 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.612973 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz9gv\" (UniqueName: \"kubernetes.io/projected/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c-kube-api-access-fz9gv\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.624348 4810 scope.go:117] "RemoveContainer" containerID="192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.662274 4810 scope.go:117] "RemoveContainer" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" Oct 03 07:20:33 crc kubenswrapper[4810]: E1003 07:20:33.663735 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c\": container with ID starting with 444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c not found: ID does not exist" containerID="444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.663791 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c"} err="failed to get container status \"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c\": rpc error: code = NotFound desc = could not find container \"444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c\": container with ID starting with 444a98b189cad32041c8ba0c3ef497f4f36af3da08759e0d30886d0d8dcd164c not found: ID does not exist" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.663830 4810 scope.go:117] "RemoveContainer" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" Oct 03 07:20:33 crc kubenswrapper[4810]: E1003 07:20:33.664184 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0\": container with ID starting with f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 not found: ID does not exist" containerID="f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.664224 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0"} err="failed to get container status \"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0\": rpc error: code = NotFound desc = could not find container \"f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0\": container with ID starting with f4c923d2db9a96c6ce7c8598e0c81bbd6041185bc91e6bb89a9abede4f378ef0 not found: ID does not exist" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.664251 4810 scope.go:117] "RemoveContainer" containerID="192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9" Oct 03 07:20:33 crc kubenswrapper[4810]: E1003 07:20:33.664963 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9\": container with ID starting with 192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9 not found: ID does not exist" containerID="192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9" Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.665004 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9"} err="failed to get container status \"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9\": rpc error: code = NotFound desc = could not find container \"192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9\": container with ID starting with 192fa0dd771fb15343345fa87232bb9f4a5af1757eed86bd8195a651106ab9e9 not found: ID does not exist" Oct 03 07:20:34 crc kubenswrapper[4810]: I1003 07:20:34.567342 4810 generic.go:334] "Generic (PLEG): container finished" podID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerID="1dc52a1d91d58d42a133fd8634f92be4368fdda5c3e244244cb985f1a0e500b4" exitCode=137 Oct 03 07:20:34 crc kubenswrapper[4810]: I1003 07:20:34.567512 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"1dc52a1d91d58d42a133fd8634f92be4368fdda5c3e244244cb985f1a0e500b4"} Oct 03 07:20:34 crc kubenswrapper[4810]: I1003 
07:20:34.875427 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.034998 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.035095 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock\") pod \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.035224 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") pod \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.035284 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache\") pod \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.035348 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfbdm\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm\") pod \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\" (UID: \"3bc41979-74c8-4736-bdb0-bb6c66837ad2\") " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.035967 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache" (OuterVolumeSpecName: "cache") pod "3bc41979-74c8-4736-bdb0-bb6c66837ad2" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.036159 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock" (OuterVolumeSpecName: "lock") pod "3bc41979-74c8-4736-bdb0-bb6c66837ad2" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.039398 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "3bc41979-74c8-4736-bdb0-bb6c66837ad2" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.039926 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3bc41979-74c8-4736-bdb0-bb6c66837ad2" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.039997 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm" (OuterVolumeSpecName: "kube-api-access-sfbdm") pod "3bc41979-74c8-4736-bdb0-bb6c66837ad2" (UID: "3bc41979-74c8-4736-bdb0-bb6c66837ad2"). InnerVolumeSpecName "kube-api-access-sfbdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.137329 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.137378 4810 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-cache\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.137395 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfbdm\" (UniqueName: \"kubernetes.io/projected/3bc41979-74c8-4736-bdb0-bb6c66837ad2-kube-api-access-sfbdm\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.137439 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.137451 4810 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3bc41979-74c8-4736-bdb0-bb6c66837ad2-lock\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.157104 4810 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.239206 4810 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.312460 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" path="/var/lib/kubelet/pods/5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c/volumes" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.594485 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3bc41979-74c8-4736-bdb0-bb6c66837ad2","Type":"ContainerDied","Data":"3c7bab0219b1d7efab0e67757983842a7cfffd5a6a394d9479ccce8a61dd6084"} Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.594574 4810 scope.go:117] "RemoveContainer" containerID="1dc52a1d91d58d42a133fd8634f92be4368fdda5c3e244244cb985f1a0e500b4" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.594696 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.636391 4810 scope.go:117] "RemoveContainer" containerID="889f53c414198ba8568ddac1401c1afddd20f9d138e22cef33957e8a0c27a046" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.645176 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.650690 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.678266 4810 scope.go:117] "RemoveContainer" containerID="895aa7e546ccbeb235b2c62051d1f3cb4581f5dca2fdbb37fa958319aba74571" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.717604 4810 scope.go:117] "RemoveContainer" containerID="888e7fe71210eba8464268fac89d6874d77e318e83fa8a722297b8a5dbe5627c" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.742730 4810 scope.go:117] "RemoveContainer" containerID="ac5b608d8a7aa92bd915544af8c042f3b63ff7151a33ff4dbb17a4a861e889b3" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.765741 4810 scope.go:117] "RemoveContainer" containerID="2fea3519604050a7d09779c938e0622ac0f41e5a7bf1dd301b0b7a60c049d624" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.788595 4810 scope.go:117] "RemoveContainer" containerID="ccf54b0b656baac5b5fa788f8029adbf50760366a62fbd65bb43a3456b3066a8" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.806870 4810 scope.go:117] "RemoveContainer" containerID="46fe542f82adb9dd0f09edd022b0f4c932309d5e5f689981a10664ea462e5666" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.826242 4810 scope.go:117] "RemoveContainer" containerID="260956333a59a33c2b6828c4e5152fe36a01a3f751272837729f275e4afc7b35" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.844445 4810 scope.go:117] "RemoveContainer" containerID="45e3ddc52af834f603045aa999d0eb049c646473758731b30f7e8e2af55782fa" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.865418 4810 scope.go:117] "RemoveContainer" containerID="2b0ce3947967fa96d846a63bd3163c855b442489cfeea6683ecb041c0f9cf8f1" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.886400 4810 scope.go:117] "RemoveContainer" containerID="9974961e6abf7febdcf8e34a0c97b87742b29d09441f5766a4da92668b493130" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.905306 4810 scope.go:117] "RemoveContainer" containerID="0f9c3077ad8cecdd534fca11fbffb1bf485c180ae8d7129b25caffd483249975" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.928928 4810 scope.go:117] "RemoveContainer" containerID="a6d2ec4008f34f574e9c7b7b8a4964291b10701cd744120f6b95298162e1741c" Oct 03 07:20:35 crc kubenswrapper[4810]: I1003 07:20:35.957481 4810 scope.go:117] "RemoveContainer" containerID="bccbd5c0aebf7a8d7a2c209dbc4d28096af4feec63ee3acb6fb99edbed1f7128" Oct 03 07:20:37 crc kubenswrapper[4810]: I1003 07:20:37.317049 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" path="/var/lib/kubelet/pods/3bc41979-74c8-4736-bdb0-bb6c66837ad2/volumes" Oct 03 07:20:38 crc kubenswrapper[4810]: I1003 07:20:38.020287 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod2722ac3d-f149-4926-9b5e-cb43d477c15a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod2722ac3d-f149-4926-9b5e-cb43d477c15a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2722ac3d_f149_4926_9b5e_cb43d477c15a.slice" 
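(Annotation, not part of the journal.) The entries above trace the teardown of openstack/ovn-controller-ovs-s8f2s and openstack/swift-storage-0: readiness-probe ExecSync calls return NotFound once the container process is gone, the volume reconciler unmounts and detaches each volume, PLEG reports ContainerDied with exit code 137, RemoveContainer races with CRI garbage collection and hits "ID does not exist", orphaned volume directories are cleaned up, and one cgroup removal times out waiting on systemd. Below is a minimal Python sketch for pulling such lines into a per-pod timeline; the regular expressions are assumptions based on the journald/klog text format visible in this log, not an official parser.

```python
#!/usr/bin/env python3
"""Illustrative only: group kubenswrapper journal entries by pod.

Assumes journalctl-style input on stdin, one entry per line, in the
"Oct 03 07:20:33 crc kubenswrapper[4810]: I1003 07:20:33.552614 4810 scope.go:117] ..."
shape seen above. This is a reading aid, not kubelet tooling.
"""
import re
import sys
from collections import defaultdict

# journald prefix + klog header (severity, MMDD, time, pid, file:line])
LINE_RE = re.compile(
    r'(?P<month>\w{3}) (?P<day>\d{2}) (?P<time>\d{2}:\d{2}:\d{2}) \S+ kubenswrapper\[\d+\]: '
    r'(?P<sev>[IWEF])\d{4} (?P<ktime>\d{2}:\d{2}:\d{2}\.\d+) \d+ (?P<src>[\w.]+:\d+)\] (?P<rest>.*)'
)
# key="quoted value" or key=bareword pairs that follow the quoted message
KV_RE = re.compile(r'(\w[\w.-]*)=(".*?(?<!\\)"|\S+)')

def parse(line):
    """Return a flat dict for one entry, or None for non-kubenswrapper lines."""
    m = LINE_RE.match(line)
    if not m:
        return None
    rest = m.group('rest')
    # The structured message is the first quoted string, e.g. "RemoveContainer".
    msg = rest.split('"')[1] if rest.startswith('"') else rest
    fields = {k: v.strip('"') for k, v in KV_RE.findall(rest)}
    return {'time': m.group('ktime'), 'sev': m.group('sev'),
            'src': m.group('src'), 'msg': msg, **fields}

def timeline(lines):
    """Group parsed entries by pod name (or podUID when no name is logged)."""
    by_pod = defaultdict(list)
    for line in lines:
        rec = parse(line)
        if rec and ('pod' in rec or 'podUID' in rec):
            by_pod[rec.get('pod') or rec.get('podUID')].append(rec)
    return by_pod

if __name__ == '__main__':
    for pod, events in timeline(sys.stdin).items():
        print(pod)
        for e in events:
            print(f"  {e['time']} {e['sev']} {e['src']:<28} {e['msg']}")
```

Fed a dump like this one (one entry per line), it lets the probe failures, UnmountVolume.TearDown steps, PLEG events, and RemoveContainer errors for a single pod be read in order instead of interleaved with other pods.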
Oct 03 07:20:38 crc kubenswrapper[4810]: E1003 07:20:38.020832 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod2722ac3d-f149-4926-9b5e-cb43d477c15a] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod2722ac3d-f149-4926-9b5e-cb43d477c15a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod2722ac3d_f149_4926_9b5e_cb43d477c15a.slice" pod="openstack/ovsdbserver-sb-0" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" Oct 03 07:20:38 crc kubenswrapper[4810]: I1003 07:20:38.632588 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 07:20:38 crc kubenswrapper[4810]: I1003 07:20:38.693517 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:20:38 crc kubenswrapper[4810]: I1003 07:20:38.703626 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 07:20:39 crc kubenswrapper[4810]: I1003 07:20:39.318723 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2722ac3d-f149-4926-9b5e-cb43d477c15a" path="/var/lib/kubelet/pods/2722ac3d-f149-4926-9b5e-cb43d477c15a/volumes" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125424 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125780 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125793 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125811 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-reaper" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125817 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-reaper" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125827 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125833 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125842 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-central-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125848 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-central-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125856 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125861 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125872 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ebbf1a25-79f6-42f3-af72-ff0e33094ecd" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125877 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebbf1a25-79f6-42f3-af72-ff0e33094ecd" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125885 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125932 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125946 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125952 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-server" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125965 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server-init" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125973 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server-init" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.125981 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.125986 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126000 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="sg-core" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126006 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="sg-core" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126014 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a856067b-dba0-4860-93c3-ff4650760a4b" containerName="keystone-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126020 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a856067b-dba0-4860-93c3-ff4650760a4b" containerName="keystone-api" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126027 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerName="nova-scheduler-scheduler" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126033 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerName="nova-scheduler-scheduler" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126043 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126049 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126057 4810 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126064 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126074 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b493c984-b8a7-4a78-9210-fc44ecf70eec" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126080 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b493c984-b8a7-4a78-9210-fc44ecf70eec" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126088 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="swift-recon-cron" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126093 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="swift-recon-cron" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126102 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6eb4ae-ef62-480d-80a9-411be811155b" containerName="kube-state-metrics" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126107 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6eb4ae-ef62-480d-80a9-411be811155b" containerName="kube-state-metrics" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126117 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa9089-99f4-4d50-8fe8-5f4492fe3767" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126123 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa9089-99f4-4d50-8fe8-5f4492fe3767" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126132 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126138 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-server" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126310 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126317 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126324 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126330 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126338 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-expirer" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126344 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-expirer" Oct 03 07:20:40 crc 
kubenswrapper[4810]: E1003 07:20:40.126350 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126371 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126378 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" containerName="nova-cell0-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126384 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" containerName="nova-cell0-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126394 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="proxy-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126399 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="proxy-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126410 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126416 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126423 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126429 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126439 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerName="nova-cell1-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126445 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerName="nova-cell1-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126454 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126460 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126467 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126473 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126484 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126491 4810 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126497 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126502 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126509 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-notification-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126515 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-notification-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126525 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126531 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126539 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126545 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-api" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126557 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126562 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126569 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126575 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-api" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126583 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="mysql-bootstrap" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126589 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="mysql-bootstrap" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126597 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126602 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126611 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="setup-container" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126617 4810 
state_mem.go:107] "Deleted CPUSet assignment" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="setup-container" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126623 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126629 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-server" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126638 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="setup-container" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126645 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="setup-container" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126654 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126660 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126671 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126677 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126687 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="galera" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126693 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="galera" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126702 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" containerName="memcached" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126707 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" containerName="memcached" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126717 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="rsync" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126722 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="rsync" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126731 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126737 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126744 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="openstack-network-exporter" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126750 4810 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="openstack-network-exporter" Oct 03 07:20:40 crc kubenswrapper[4810]: E1003 07:20:40.126755 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126763 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126939 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="064ce5a9-4b27-4851-93ea-aa8c3f038eb8" containerName="galera" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126952 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a52a94-56b0-4dc9-9804-020a890b1fff" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126959 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-reaper" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126969 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="rsync" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126976 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fa9089-99f4-4d50-8fe8-5f4492fe3767" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126984 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cd32da-b730-4e57-a2e0-41bf95ff8ca1" containerName="rabbitmq" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.126992 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-notification-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127002 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a856067b-dba0-4860-93c3-ff4650760a4b" containerName="keystone-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127012 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b493c984-b8a7-4a78-9210-fc44ecf70eec" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127020 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127029 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127036 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cde0e0d-aa4d-45e1-bac8-52225a6c997c" containerName="barbican-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127045 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127054 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127060 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-server" Oct 03 07:20:40 crc 
kubenswrapper[4810]: I1003 07:20:40.127069 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127076 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127084 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="sg-core" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127092 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127102 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127108 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127117 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-api" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127124 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fce1d94-bf74-40fe-a1bf-9e03b2ad92e1" containerName="memcached" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127131 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="swift-recon-cron" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127137 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="ceilometer-central-agent" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127143 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127149 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a145896-686f-4bb6-b44b-fac558a2f4fd" containerName="neutron-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127157 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43a63ef-cea9-48b2-af35-04fe0a365fa5" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127162 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c723fc6-aae6-4032-afba-622dd5ea4dbc" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127171 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="openstack-network-exporter" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127179 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="589491ba-eae0-427c-8124-cacdcabd03c0" containerName="nova-cell0-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127187 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0ac24c-ed16-445a-b135-41b696510fc8" containerName="nova-cell1-conductor-conductor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127193 4810 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fc7b4477-ff24-462e-8d3e-99794719fd37" containerName="ovn-northd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127203 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebbf1a25-79f6-42f3-af72-ff0e33094ecd" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127209 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-expirer" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.127216 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovsdb-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129023 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-updater" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129033 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="account-server" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129041 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7c316c-a9f6-4fa2-b150-83d6b65a9d25" containerName="nova-api-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129050 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="object-replicator" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129060 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a78c823-e408-41d7-9f0f-3a89b648dae7" containerName="glance-log" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129067 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a422cc8-26fd-4749-9648-c00d6f5c9009" containerName="proxy-httpd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129077 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6eb4ae-ef62-480d-80a9-411be811155b" containerName="kube-state-metrics" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129084 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac7cdb2-a224-4fcb-adc7-3ceedd74f47c" containerName="ovs-vswitchd" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129091 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="42b07401-8647-4f71-a3a8-ef7e2e72e171" containerName="nova-scheduler-scheduler" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129099 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc41979-74c8-4736-bdb0-bb6c66837ad2" containerName="container-auditor" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.129107 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbda0a2-4e5b-4fce-9a4b-3a36a632d0fa" containerName="mariadb-account-delete" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.130934 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.131027 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.319854 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.319998 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.320116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gzv2\" (UniqueName: \"kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.421691 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.421750 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gzv2\" (UniqueName: \"kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.421814 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.422334 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.422365 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities\") pod \"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.446861 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gzv2\" (UniqueName: \"kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2\") pod 
\"certified-operators-gvtj2\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.458913 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:40 crc kubenswrapper[4810]: I1003 07:20:40.950098 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:40 crc kubenswrapper[4810]: W1003 07:20:40.961124 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932e165e_5619_4e0b_b41c_849e402919ac.slice/crio-7aa103e339488b118a031f2217fe66f72db5ab0b29f3e68082351b194655bc9b WatchSource:0}: Error finding container 7aa103e339488b118a031f2217fe66f72db5ab0b29f3e68082351b194655bc9b: Status 404 returned error can't find the container with id 7aa103e339488b118a031f2217fe66f72db5ab0b29f3e68082351b194655bc9b Oct 03 07:20:41 crc kubenswrapper[4810]: I1003 07:20:41.674637 4810 generic.go:334] "Generic (PLEG): container finished" podID="932e165e-5619-4e0b-b41c-849e402919ac" containerID="473ea1505f761c22c01a28a1104819b1315dcdbbb626c0195942905a42583937" exitCode=0 Oct 03 07:20:41 crc kubenswrapper[4810]: I1003 07:20:41.674701 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerDied","Data":"473ea1505f761c22c01a28a1104819b1315dcdbbb626c0195942905a42583937"} Oct 03 07:20:41 crc kubenswrapper[4810]: I1003 07:20:41.674772 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerStarted","Data":"7aa103e339488b118a031f2217fe66f72db5ab0b29f3e68082351b194655bc9b"} Oct 03 07:20:42 crc kubenswrapper[4810]: I1003 07:20:42.683003 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerStarted","Data":"741335be8bb628c9492d128e18e37d5324d2985ed6e7bfcf9d1283a7abf3d295"} Oct 03 07:20:42 crc kubenswrapper[4810]: E1003 07:20:42.975063 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod932e165e_5619_4e0b_b41c_849e402919ac.slice/crio-conmon-741335be8bb628c9492d128e18e37d5324d2985ed6e7bfcf9d1283a7abf3d295.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:20:43 crc kubenswrapper[4810]: I1003 07:20:43.697060 4810 generic.go:334] "Generic (PLEG): container finished" podID="932e165e-5619-4e0b-b41c-849e402919ac" containerID="741335be8bb628c9492d128e18e37d5324d2985ed6e7bfcf9d1283a7abf3d295" exitCode=0 Oct 03 07:20:43 crc kubenswrapper[4810]: I1003 07:20:43.697223 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerDied","Data":"741335be8bb628c9492d128e18e37d5324d2985ed6e7bfcf9d1283a7abf3d295"} Oct 03 07:20:44 crc kubenswrapper[4810]: I1003 07:20:44.711012 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" 
event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerStarted","Data":"9a491d6da170b5251fc7191a95b505d2245c85c295159be08ce71a152b5e6065"} Oct 03 07:20:44 crc kubenswrapper[4810]: I1003 07:20:44.752430 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gvtj2" podStartSLOduration=2.273122517 podStartE2EDuration="4.752405104s" podCreationTimestamp="2025-10-03 07:20:40 +0000 UTC" firstStartedPulling="2025-10-03 07:20:41.677267236 +0000 UTC m=+1475.104517971" lastFinishedPulling="2025-10-03 07:20:44.156549793 +0000 UTC m=+1477.583800558" observedRunningTime="2025-10-03 07:20:44.747994995 +0000 UTC m=+1478.175245740" watchObservedRunningTime="2025-10-03 07:20:44.752405104 +0000 UTC m=+1478.179655849" Oct 03 07:20:50 crc kubenswrapper[4810]: I1003 07:20:50.459927 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:50 crc kubenswrapper[4810]: I1003 07:20:50.460506 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:50 crc kubenswrapper[4810]: I1003 07:20:50.543064 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:50 crc kubenswrapper[4810]: I1003 07:20:50.839756 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:50 crc kubenswrapper[4810]: I1003 07:20:50.889491 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:52 crc kubenswrapper[4810]: I1003 07:20:52.791541 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gvtj2" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="registry-server" containerID="cri-o://9a491d6da170b5251fc7191a95b505d2245c85c295159be08ce71a152b5e6065" gracePeriod=2 Oct 03 07:20:53 crc kubenswrapper[4810]: I1003 07:20:53.812313 4810 generic.go:334] "Generic (PLEG): container finished" podID="932e165e-5619-4e0b-b41c-849e402919ac" containerID="9a491d6da170b5251fc7191a95b505d2245c85c295159be08ce71a152b5e6065" exitCode=0 Oct 03 07:20:53 crc kubenswrapper[4810]: I1003 07:20:53.812380 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerDied","Data":"9a491d6da170b5251fc7191a95b505d2245c85c295159be08ce71a152b5e6065"} Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.045342 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.237043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities\") pod \"932e165e-5619-4e0b-b41c-849e402919ac\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.237098 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gzv2\" (UniqueName: \"kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2\") pod \"932e165e-5619-4e0b-b41c-849e402919ac\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.237265 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content\") pod \"932e165e-5619-4e0b-b41c-849e402919ac\" (UID: \"932e165e-5619-4e0b-b41c-849e402919ac\") " Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.238374 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities" (OuterVolumeSpecName: "utilities") pod "932e165e-5619-4e0b-b41c-849e402919ac" (UID: "932e165e-5619-4e0b-b41c-849e402919ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.246160 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2" (OuterVolumeSpecName: "kube-api-access-4gzv2") pod "932e165e-5619-4e0b-b41c-849e402919ac" (UID: "932e165e-5619-4e0b-b41c-849e402919ac"). InnerVolumeSpecName "kube-api-access-4gzv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.288340 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "932e165e-5619-4e0b-b41c-849e402919ac" (UID: "932e165e-5619-4e0b-b41c-849e402919ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.338630 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gzv2\" (UniqueName: \"kubernetes.io/projected/932e165e-5619-4e0b-b41c-849e402919ac-kube-api-access-4gzv2\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.338669 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.338682 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/932e165e-5619-4e0b-b41c-849e402919ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.828827 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gvtj2" event={"ID":"932e165e-5619-4e0b-b41c-849e402919ac","Type":"ContainerDied","Data":"7aa103e339488b118a031f2217fe66f72db5ab0b29f3e68082351b194655bc9b"} Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.828969 4810 scope.go:117] "RemoveContainer" containerID="9a491d6da170b5251fc7191a95b505d2245c85c295159be08ce71a152b5e6065" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.829193 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gvtj2" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.868290 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.871558 4810 scope.go:117] "RemoveContainer" containerID="741335be8bb628c9492d128e18e37d5324d2985ed6e7bfcf9d1283a7abf3d295" Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.875047 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gvtj2"] Oct 03 07:20:54 crc kubenswrapper[4810]: I1003 07:20:54.903177 4810 scope.go:117] "RemoveContainer" containerID="473ea1505f761c22c01a28a1104819b1315dcdbbb626c0195942905a42583937" Oct 03 07:20:55 crc kubenswrapper[4810]: I1003 07:20:55.318052 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="932e165e-5619-4e0b-b41c-849e402919ac" path="/var/lib/kubelet/pods/932e165e-5619-4e0b-b41c-849e402919ac/volumes" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.089085 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.090463 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.091320 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.092637 4810 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.092836 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" gracePeriod=600 Oct 03 07:21:02 crc kubenswrapper[4810]: E1003 07:21:02.218685 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.940226 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" exitCode=0 Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.940301 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66"} Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.940712 4810 scope.go:117] "RemoveContainer" containerID="aeed1b22382dda73787b391b657e86398a4fea15563912da8f38296568e7c489" Oct 03 07:21:02 crc kubenswrapper[4810]: I1003 07:21:02.941531 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:21:02 crc kubenswrapper[4810]: E1003 07:21:02.942067 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.799962 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:04 crc kubenswrapper[4810]: E1003 07:21:04.800398 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="extract-content" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.800419 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="extract-content" Oct 03 07:21:04 crc kubenswrapper[4810]: E1003 07:21:04.800450 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="extract-utilities" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.800462 4810 state_mem.go:107] 
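
The 07:21:02 entries above show the standard liveness-failure path for machine-config-daemon-z8f25: the prober gets connection refused on 127.0.0.1:8798, the kubelet records that the container will be restarted, and kills it with the 600-second grace period shown. The quoted restart delay is already at the 5m0s cap, so the container had evidently been failing before this window; each of the repeated "RemoveContainer" / "Error syncing pod, skipping" pairs that follow is a periodic pod re-sync (roughly every 10-15 seconds) being skipped while the back-off timer runs, not a new crash. Below is a small sketch of how such a delay reaches the cap, assuming the usual pattern of a short initial delay that doubles per failed restart; the initial value and factor are assumptions, while the 300-second cap is what the message itself states.

    def backoff_delays(initial=10.0, factor=2.0, cap=300.0, restarts=8):
        # Successive restart delays for a crash-looping container: start small,
        # double on every failure, clamp at the cap quoted in the log ("back-off 5m0s").
        delay, delays = initial, []
        for _ in range(restarts):
            delays.append(min(delay, cap))
            delay *= factor
        return delays

    print(backoff_delays())  # [10.0, 20.0, 40.0, 80.0, 160.0, 300.0, 300.0, 300.0]

Consistent with that, the skips below continue until roughly five minutes after the kill, when the daemon is finally restarted (the 07:26:14 ContainerStarted event further down).
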
"Deleted CPUSet assignment" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="extract-utilities" Oct 03 07:21:04 crc kubenswrapper[4810]: E1003 07:21:04.800499 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="registry-server" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.800511 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="registry-server" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.800789 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="932e165e-5619-4e0b-b41c-849e402919ac" containerName="registry-server" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.802781 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.809270 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.893853 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwtn\" (UniqueName: \"kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.894349 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.894384 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.996727 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.997044 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.997138 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwtn\" (UniqueName: \"kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: 
I1003 07:21:04.997511 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:04 crc kubenswrapper[4810]: I1003 07:21:04.997525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.016820 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwtn\" (UniqueName: \"kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn\") pod \"community-operators-l4vm2\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.139812 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.620071 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:05 crc kubenswrapper[4810]: W1003 07:21:05.637324 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346fd53a_edc2_4e86_bdfa_0f4d232d9ebe.slice/crio-13586ae3d5dd6a152e916e7966464013a11db75153fba61d03288e2327612fbf WatchSource:0}: Error finding container 13586ae3d5dd6a152e916e7966464013a11db75153fba61d03288e2327612fbf: Status 404 returned error can't find the container with id 13586ae3d5dd6a152e916e7966464013a11db75153fba61d03288e2327612fbf Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.976358 4810 generic.go:334] "Generic (PLEG): container finished" podID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerID="4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72" exitCode=0 Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.976397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerDied","Data":"4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72"} Oct 03 07:21:05 crc kubenswrapper[4810]: I1003 07:21:05.976422 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerStarted","Data":"13586ae3d5dd6a152e916e7966464013a11db75153fba61d03288e2327612fbf"} Oct 03 07:21:06 crc kubenswrapper[4810]: I1003 07:21:06.987598 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerStarted","Data":"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0"} Oct 03 07:21:07 crc kubenswrapper[4810]: I1003 07:21:07.998404 4810 generic.go:334] "Generic (PLEG): container finished" podID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerID="06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0" exitCode=0 Oct 03 07:21:07 crc kubenswrapper[4810]: 
I1003 07:21:07.998496 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerDied","Data":"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0"} Oct 03 07:21:09 crc kubenswrapper[4810]: I1003 07:21:09.015092 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerStarted","Data":"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c"} Oct 03 07:21:09 crc kubenswrapper[4810]: I1003 07:21:09.034352 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4vm2" podStartSLOduration=2.4328033270000002 podStartE2EDuration="5.034331189s" podCreationTimestamp="2025-10-03 07:21:04 +0000 UTC" firstStartedPulling="2025-10-03 07:21:05.978235863 +0000 UTC m=+1499.405486598" lastFinishedPulling="2025-10-03 07:21:08.579763715 +0000 UTC m=+1502.007014460" observedRunningTime="2025-10-03 07:21:09.031325318 +0000 UTC m=+1502.458576073" watchObservedRunningTime="2025-10-03 07:21:09.034331189 +0000 UTC m=+1502.461581944" Oct 03 07:21:10 crc kubenswrapper[4810]: I1003 07:21:10.659541 4810 scope.go:117] "RemoveContainer" containerID="c65d312e3fef0962f5dd898ce43bc0b2227422ec0ef8fbac0355213702bdf2d9" Oct 03 07:21:10 crc kubenswrapper[4810]: I1003 07:21:10.692175 4810 scope.go:117] "RemoveContainer" containerID="7b392a0073cffbdd3820c8d247e7f2c85594c4c28bdec1e60745b838785fde2a" Oct 03 07:21:10 crc kubenswrapper[4810]: I1003 07:21:10.719245 4810 scope.go:117] "RemoveContainer" containerID="cf1efc6c98e7db501c387b01da937aadb58092bae8f0df2fc0d91ec5c743ef15" Oct 03 07:21:10 crc kubenswrapper[4810]: I1003 07:21:10.741252 4810 scope.go:117] "RemoveContainer" containerID="2b43b8e3fc4f0ec24d5e44665727bb914fd15db10c5890da4c565201c94e81b3" Oct 03 07:21:10 crc kubenswrapper[4810]: I1003 07:21:10.771734 4810 scope.go:117] "RemoveContainer" containerID="3ff73edca1345a254a3086f098fe5ad66a80530ab9ba5dd7d33d3e1dfe7eacf1" Oct 03 07:21:14 crc kubenswrapper[4810]: I1003 07:21:14.302827 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:21:14 crc kubenswrapper[4810]: E1003 07:21:14.303651 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:21:15 crc kubenswrapper[4810]: I1003 07:21:15.140300 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:15 crc kubenswrapper[4810]: I1003 07:21:15.140370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:15 crc kubenswrapper[4810]: I1003 07:21:15.200056 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:16 crc kubenswrapper[4810]: I1003 07:21:16.118583 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:16 crc kubenswrapper[4810]: I1003 07:21:16.169725 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:18 crc kubenswrapper[4810]: I1003 07:21:18.097675 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4vm2" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="registry-server" containerID="cri-o://6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c" gracePeriod=2 Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.027099 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.117703 4810 generic.go:334] "Generic (PLEG): container finished" podID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerID="6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c" exitCode=0 Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.117775 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerDied","Data":"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c"} Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.117794 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4vm2" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.117812 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4vm2" event={"ID":"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe","Type":"ContainerDied","Data":"13586ae3d5dd6a152e916e7966464013a11db75153fba61d03288e2327612fbf"} Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.117830 4810 scope.go:117] "RemoveContainer" containerID="6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.140367 4810 scope.go:117] "RemoveContainer" containerID="06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.167467 4810 scope.go:117] "RemoveContainer" containerID="4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.169181 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content\") pod \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.169281 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjwtn\" (UniqueName: \"kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn\") pod \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.169318 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities\") pod \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\" (UID: \"346fd53a-edc2-4e86-bdfa-0f4d232d9ebe\") " Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 
07:21:20.170600 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities" (OuterVolumeSpecName: "utilities") pod "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" (UID: "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.179293 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn" (OuterVolumeSpecName: "kube-api-access-vjwtn") pod "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" (UID: "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe"). InnerVolumeSpecName "kube-api-access-vjwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.188218 4810 scope.go:117] "RemoveContainer" containerID="6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c" Oct 03 07:21:20 crc kubenswrapper[4810]: E1003 07:21:20.189016 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c\": container with ID starting with 6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c not found: ID does not exist" containerID="6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.189065 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c"} err="failed to get container status \"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c\": rpc error: code = NotFound desc = could not find container \"6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c\": container with ID starting with 6a1919f95a176668ddee6dc901433034c27ca8e86c52f82665ac5dc03e26bc8c not found: ID does not exist" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.189103 4810 scope.go:117] "RemoveContainer" containerID="06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0" Oct 03 07:21:20 crc kubenswrapper[4810]: E1003 07:21:20.189602 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0\": container with ID starting with 06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0 not found: ID does not exist" containerID="06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.189678 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0"} err="failed to get container status \"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0\": rpc error: code = NotFound desc = could not find container \"06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0\": container with ID starting with 06c0b9f2bc9a82931c640ff041a6f1a05b46674911b2fd030961ab8feaf7a8b0 not found: ID does not exist" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.189705 4810 scope.go:117] "RemoveContainer" containerID="4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72" Oct 03 07:21:20 crc kubenswrapper[4810]: E1003 
07:21:20.190179 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72\": container with ID starting with 4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72 not found: ID does not exist" containerID="4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.190233 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72"} err="failed to get container status \"4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72\": rpc error: code = NotFound desc = could not find container \"4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72\": container with ID starting with 4eddf4305913b8caaa2b892d1297db08b7b08e8483363ac0b12ab7a5cd061b72 not found: ID does not exist" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.238208 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" (UID: "346fd53a-edc2-4e86-bdfa-0f4d232d9ebe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.271156 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjwtn\" (UniqueName: \"kubernetes.io/projected/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-kube-api-access-vjwtn\") on node \"crc\" DevicePath \"\"" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.271786 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.271844 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.460157 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:20 crc kubenswrapper[4810]: I1003 07:21:20.478382 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4vm2"] Oct 03 07:21:21 crc kubenswrapper[4810]: I1003 07:21:21.316401 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" path="/var/lib/kubelet/pods/346fd53a-edc2-4e86-bdfa-0f4d232d9ebe/volumes" Oct 03 07:21:27 crc kubenswrapper[4810]: I1003 07:21:27.305313 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:21:27 crc kubenswrapper[4810]: E1003 07:21:27.306121 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:21:41 crc 
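
The community-operators-l4vm2 teardown above repeats the pattern already seen for certified-operators-gvtj2, with one extra wrinkle: the "ContainerStatus from runtime service failed ... NotFound" and "DeleteContainer returned error" entries mean CRI-O had already removed those containers by the time the kubelet asked again, and the cleanup still completes (volumes detached, orphaned pod volumes dir removed). Since the same extract-utilities, extract-content, registry-server sequence recurs for every marketplace catalog pod in this log, the short parsing sketch below can recover a pod's container lifecycle from the PLEG events; it assumes one journal entry per line, as journalctl normally emits, and the regular expression is based only on the format shown here.

    import json
    import re

    PLEG = re.compile(r'event=(\{[^}]*\})')

    def pleg_events(journal_lines, pod):
        # Yield (event type, container or sandbox ID) for one pod, in log order,
        # from the 'SyncLoop (PLEG): event for pod' entries.
        for line in journal_lines:
            if '"SyncLoop (PLEG): event for pod"' not in line or f'pod="{pod}"' not in line:
                continue
            m = PLEG.search(line)
            if m:
                ev = json.loads(m.group(1))
                yield ev["Type"], ev["Data"]

    # Example:
    # for kind, cid in pleg_events(open("kubelet.log"), "openshift-marketplace/community-operators-l4vm2"):
    #     print(kind, cid)

For each of the three catalog pods in this section the yielded sequence has the same shape: a handful of ContainerStarted/ContainerDied events covering the sandbox, the two init containers, and registry-server, ending with the Died events once the pod is deleted.
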
kubenswrapper[4810]: I1003 07:21:41.302618 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:21:41 crc kubenswrapper[4810]: E1003 07:21:41.303308 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:21:53 crc kubenswrapper[4810]: I1003 07:21:53.302626 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:21:53 crc kubenswrapper[4810]: E1003 07:21:53.304799 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:22:07 crc kubenswrapper[4810]: I1003 07:22:07.309028 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:22:07 crc kubenswrapper[4810]: E1003 07:22:07.309847 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.228266 4810 scope.go:117] "RemoveContainer" containerID="efbce464b43ea569647c2f211d82d262654767de06209c1d333f8956f0cfde74" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.261968 4810 scope.go:117] "RemoveContainer" containerID="2d48901a7cfa1029c525d00ad89dff9e258bc488c0aa6f201c065b594d0b5cd4" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.353625 4810 scope.go:117] "RemoveContainer" containerID="604fbe674101eea5cb7e0d47c285c60a0094065521648f16611c121922cc9a2f" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.380595 4810 scope.go:117] "RemoveContainer" containerID="19bc7021dff026f5ac16902ab90bf92bbb2ae86b78f15f02e6fa93154a4ef524" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.420466 4810 scope.go:117] "RemoveContainer" containerID="b5e2a6f32a511cfe5c2bcb58d96e9477a878fec9efb648f6d20cbea45fa1123b" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.468482 4810 scope.go:117] "RemoveContainer" containerID="3d014cea719ad2754563bf708ca5edd4cb57cd05cdc5d9f1b0fe47be4fe251f0" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.521428 4810 scope.go:117] "RemoveContainer" containerID="a03e4c85b6ff7858d2a856ad993d26b4ff37bf9230182e4686673134b00ae28d" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.558983 4810 scope.go:117] "RemoveContainer" containerID="ba8c8fe21e1637c7d147097344118435d1e9acd34eb2f690dcbaedd538f93903" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.581514 4810 scope.go:117] "RemoveContainer" 
containerID="456574b4e24c97ad994ceb383265e419a3c7ecdc16cc8a3733edb174e95a36ad" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.614914 4810 scope.go:117] "RemoveContainer" containerID="f56a3b36c420c607e23d9871343e804d3def38c7f9f58143a8bb5e35ed03e6e3" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.633096 4810 scope.go:117] "RemoveContainer" containerID="a912efb34fb2f84872b11c81fd5c95edc0960044fe3ae8e5a4a1b417cd38a487" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.655311 4810 scope.go:117] "RemoveContainer" containerID="da7ab208b7a7f6acd54276737cbd12b568557be50e4c4853ebfbdce1cc820332" Oct 03 07:22:11 crc kubenswrapper[4810]: I1003 07:22:11.674621 4810 scope.go:117] "RemoveContainer" containerID="ff727ce345769006de0cc33823d09d7e241aca4424130baa33108f866e5cb72c" Oct 03 07:22:19 crc kubenswrapper[4810]: I1003 07:22:19.303139 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:22:19 crc kubenswrapper[4810]: E1003 07:22:19.303991 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:22:32 crc kubenswrapper[4810]: I1003 07:22:32.303191 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:22:32 crc kubenswrapper[4810]: E1003 07:22:32.304250 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:22:44 crc kubenswrapper[4810]: I1003 07:22:44.301982 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:22:44 crc kubenswrapper[4810]: E1003 07:22:44.303396 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:22:57 crc kubenswrapper[4810]: I1003 07:22:57.310998 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:22:57 crc kubenswrapper[4810]: E1003 07:22:57.311845 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:23:11 crc kubenswrapper[4810]: I1003 07:23:11.977337 4810 
scope.go:117] "RemoveContainer" containerID="d9d3ff248266b41b739b029c6e20e9253b3a96fceeb907d9681d132dbed6d665" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.021066 4810 scope.go:117] "RemoveContainer" containerID="4edd2c79f8fe2026cb6b52aabaff3103779a04f7415a9f5bd4772503c6dbdd01" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.044088 4810 scope.go:117] "RemoveContainer" containerID="47d80e02d12804f4a96f6f83f5614932491036bb359019c9891d07577398c937" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.085647 4810 scope.go:117] "RemoveContainer" containerID="c466dfa6a59d96e0d7faf89c262bdc91ffef8e54dbc15010039c1aa7f1e0a15a" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.104617 4810 scope.go:117] "RemoveContainer" containerID="a7bf0333a5fea00209ffe7878043268796bb6130c3b253a0030e2a551d03432e" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.123107 4810 scope.go:117] "RemoveContainer" containerID="1f17eff80bda95e4a3a828da788b685a51858f514dff3cf24b3bd09f553d4640" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.140275 4810 scope.go:117] "RemoveContainer" containerID="db3e2a840110e13b015cb2f3ec8f381cda6865452f991a0abd017a3f0c638437" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.164959 4810 scope.go:117] "RemoveContainer" containerID="9aed72a1383db33d14bca460e68cc2d23b09efa0d6304ea25bf268fbd94fc76f" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.209296 4810 scope.go:117] "RemoveContainer" containerID="faf3614757fd2717eb1a45d712084253daff94f38893df384ccb4ba891e7c533" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.225216 4810 scope.go:117] "RemoveContainer" containerID="22a40c14e070770d0a321f3a95675ac6ccfbac50ee1ca196e1e371d5f2b757f2" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.248575 4810 scope.go:117] "RemoveContainer" containerID="9a80e38f2f7cb35c6df40631d07f6fa9af9ccb09c4a52542e08ea4afc6fc13d5" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.286478 4810 scope.go:117] "RemoveContainer" containerID="6d769ed9f0cf1f6249f4f26aa6c0432b45547792f38f58ec8b3c51b5293b812d" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.302547 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:23:12 crc kubenswrapper[4810]: E1003 07:23:12.302859 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.322762 4810 scope.go:117] "RemoveContainer" containerID="116ca4730fd85c0b04786638fed840cf4e9ac1f674f214b442febeb58db8324f" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.346618 4810 scope.go:117] "RemoveContainer" containerID="c75eea83419638c2ba3c4d7f63f3fef697e22f91ee2cf79cf0191a60c2616f4c" Oct 03 07:23:12 crc kubenswrapper[4810]: I1003 07:23:12.386193 4810 scope.go:117] "RemoveContainer" containerID="17d9144dd433472f1a4fb68c490e1d20c3e7b4f654b68c31571cb1ea1e6b851c" Oct 03 07:23:27 crc kubenswrapper[4810]: I1003 07:23:27.312139 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:23:27 crc kubenswrapper[4810]: E1003 07:23:27.313414 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:23:42 crc kubenswrapper[4810]: I1003 07:23:42.302493 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:23:42 crc kubenswrapper[4810]: E1003 07:23:42.303807 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:23:54 crc kubenswrapper[4810]: I1003 07:23:54.303047 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:23:54 crc kubenswrapper[4810]: E1003 07:23:54.304295 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:24:07 crc kubenswrapper[4810]: I1003 07:24:07.307094 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:24:07 crc kubenswrapper[4810]: E1003 07:24:07.308085 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.656056 4810 scope.go:117] "RemoveContainer" containerID="9d7aec1813e16ffaf6566acf654970929ac9f6bc63b472068381f3dc2dfc92e1" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.707370 4810 scope.go:117] "RemoveContainer" containerID="0247a4056b29a103d07032ed7a3c453d738bdaa7868ee5953410287fdcac3220" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.730992 4810 scope.go:117] "RemoveContainer" containerID="3f01201fc73b718d0aea9571880ce1f2a30e6b5a5955dcbaa7b43e5aedc4fe65" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.748666 4810 scope.go:117] "RemoveContainer" containerID="b1d5a09f50b97a0b1a6043433d3b498884f9e351468f8bfcf55a3f1d35912ab0" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.795884 4810 scope.go:117] "RemoveContainer" containerID="d1f756ac00c5c1c73d319dfc9ee7cf96f32a3e5421d41971071b05ee6e0f4c9e" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.822631 4810 scope.go:117] "RemoveContainer" containerID="a5552a4ebe7248adad1514badc14a89d193f6d57a51f267e35d82e36a87c0a3c" Oct 03 07:24:12 crc 
kubenswrapper[4810]: I1003 07:24:12.844140 4810 scope.go:117] "RemoveContainer" containerID="bb77897c0ed3781276a2a824ff0d9bfd38c627747b91d61f75b3403cb0b2103a" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.870499 4810 scope.go:117] "RemoveContainer" containerID="a949b96df00f2d763ee4f949894529763e305ba03a38a90d148cf3ea09c09c2e" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.889949 4810 scope.go:117] "RemoveContainer" containerID="c247d711c2297e4a2d0cc422533a2100aac1e02fae1217c0313bfef19b61465c" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.914169 4810 scope.go:117] "RemoveContainer" containerID="3216b2c3caf454d8a6c7df9a162014f7925ab9d4e0f9fb8999f1009a4df3da6f" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.936107 4810 scope.go:117] "RemoveContainer" containerID="61317c73aecc47dae6a6843a4447305cfab5916eb86c694594f67fbaea6a141f" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.953872 4810 scope.go:117] "RemoveContainer" containerID="48e3bcb0a0eff2f23d7e18286ed77d99da1068ce70fea65ac4596f9737bd9bb9" Oct 03 07:24:12 crc kubenswrapper[4810]: I1003 07:24:12.976585 4810 scope.go:117] "RemoveContainer" containerID="65b37b6e4a903e3fb2e71f4f76ac62abb4ac05735afaec0a7de87bdfc314c2c0" Oct 03 07:24:19 crc kubenswrapper[4810]: I1003 07:24:19.303468 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:24:19 crc kubenswrapper[4810]: E1003 07:24:19.304325 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:24:31 crc kubenswrapper[4810]: I1003 07:24:31.302581 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:24:31 crc kubenswrapper[4810]: E1003 07:24:31.303406 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:24:43 crc kubenswrapper[4810]: I1003 07:24:43.303262 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:24:43 crc kubenswrapper[4810]: E1003 07:24:43.304323 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:24:54 crc kubenswrapper[4810]: I1003 07:24:54.302398 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:24:54 crc kubenswrapper[4810]: E1003 07:24:54.303352 4810 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:25:07 crc kubenswrapper[4810]: I1003 07:25:07.307795 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:25:07 crc kubenswrapper[4810]: E1003 07:25:07.308608 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.144529 4810 scope.go:117] "RemoveContainer" containerID="6ae3cb61504efe6139088cb930100070bf14667b9b9ded1ec426e0e440d762ab" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.167811 4810 scope.go:117] "RemoveContainer" containerID="c7d1d770ee99b774d99f0d4f9d1665eb327b3f5064b186c996508f4ca369c666" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.190751 4810 scope.go:117] "RemoveContainer" containerID="5967e80a1430b2f097bc5422a0203827666c884ad43908ec20853a87042ca1d9" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.233200 4810 scope.go:117] "RemoveContainer" containerID="af47ab8048ae97300d367ea13bca5094cd2d5e8c4bb3543b179640fe548d3c80" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.277689 4810 scope.go:117] "RemoveContainer" containerID="c08aeca318c51ae02390085c7ab9619670b1e308ec26d5e501640e18baa3cc63" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.321453 4810 scope.go:117] "RemoveContainer" containerID="daa5d2f72530ea6f244cede184ff0977d9ed32815ea3a1d539a63477573c3744" Oct 03 07:25:13 crc kubenswrapper[4810]: I1003 07:25:13.350651 4810 scope.go:117] "RemoveContainer" containerID="a45f333c001d7ab511200f81f6fa6f80c1ad4e082222f60409447082f87c2d1e" Oct 03 07:25:19 crc kubenswrapper[4810]: I1003 07:25:19.302352 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:25:19 crc kubenswrapper[4810]: E1003 07:25:19.303106 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:25:31 crc kubenswrapper[4810]: I1003 07:25:31.302411 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:25:31 crc kubenswrapper[4810]: E1003 07:25:31.304134 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:25:45 crc kubenswrapper[4810]: I1003 07:25:45.303431 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:25:45 crc kubenswrapper[4810]: E1003 07:25:45.304352 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:26:00 crc kubenswrapper[4810]: I1003 07:26:00.302831 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:26:00 crc kubenswrapper[4810]: E1003 07:26:00.304211 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:26:13 crc kubenswrapper[4810]: I1003 07:26:13.302602 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:26:13 crc kubenswrapper[4810]: I1003 07:26:13.461710 4810 scope.go:117] "RemoveContainer" containerID="95b505fefa5eb2055f79509fe6a51262d5e6c2ddec8bdf8c70e3b8cacb9557c6" Oct 03 07:26:13 crc kubenswrapper[4810]: I1003 07:26:13.483114 4810 scope.go:117] "RemoveContainer" containerID="a08158729ce0fad165b64b76d5d8b70aee5356d571f7f8136158efbbc91849ec" Oct 03 07:26:13 crc kubenswrapper[4810]: I1003 07:26:13.499094 4810 scope.go:117] "RemoveContainer" containerID="3f1514c03f3de86269cf910c6642b971ca929d24ed8843f73f5cd456ca222515" Oct 03 07:26:14 crc kubenswrapper[4810]: I1003 07:26:14.818027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd"} Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.038947 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:43 crc kubenswrapper[4810]: E1003 07:27:43.041263 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="registry-server" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.041376 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="registry-server" Oct 03 07:27:43 crc kubenswrapper[4810]: E1003 07:27:43.041466 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="extract-content" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.041542 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="extract-content" Oct 03 07:27:43 crc kubenswrapper[4810]: E1003 07:27:43.041622 4810 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="extract-utilities" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.041704 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="extract-utilities" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.041974 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="346fd53a-edc2-4e86-bdfa-0f4d232d9ebe" containerName="registry-server" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.043335 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.064336 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.171637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.172069 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.172279 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlrgb\" (UniqueName: \"kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.273070 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.273109 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.273179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlrgb\" (UniqueName: \"kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.273787 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.274011 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.315741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlrgb\" (UniqueName: \"kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb\") pod \"redhat-marketplace-ksb5w\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.373816 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:43 crc kubenswrapper[4810]: I1003 07:27:43.869244 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:44 crc kubenswrapper[4810]: I1003 07:27:44.580346 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerID="8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9" exitCode=0 Oct 03 07:27:44 crc kubenswrapper[4810]: I1003 07:27:44.580379 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerDied","Data":"8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9"} Oct 03 07:27:44 crc kubenswrapper[4810]: I1003 07:27:44.580402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerStarted","Data":"51e95e7e9c0b7ad60553ee7e057ec3958112e3b45cb3276c40802ff3646dbdc2"} Oct 03 07:27:44 crc kubenswrapper[4810]: I1003 07:27:44.585265 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:27:45 crc kubenswrapper[4810]: I1003 07:27:45.592062 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerID="1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d" exitCode=0 Oct 03 07:27:45 crc kubenswrapper[4810]: I1003 07:27:45.592157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerDied","Data":"1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d"} Oct 03 07:27:46 crc kubenswrapper[4810]: I1003 07:27:46.607160 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerStarted","Data":"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b"} Oct 03 07:27:46 crc kubenswrapper[4810]: I1003 07:27:46.642137 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ksb5w" podStartSLOduration=2.171924109 
podStartE2EDuration="3.642105695s" podCreationTimestamp="2025-10-03 07:27:43 +0000 UTC" firstStartedPulling="2025-10-03 07:27:44.584397425 +0000 UTC m=+1898.011648180" lastFinishedPulling="2025-10-03 07:27:46.054579021 +0000 UTC m=+1899.481829766" observedRunningTime="2025-10-03 07:27:46.635559508 +0000 UTC m=+1900.062810273" watchObservedRunningTime="2025-10-03 07:27:46.642105695 +0000 UTC m=+1900.069356470" Oct 03 07:27:53 crc kubenswrapper[4810]: I1003 07:27:53.374501 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:53 crc kubenswrapper[4810]: I1003 07:27:53.376513 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:53 crc kubenswrapper[4810]: I1003 07:27:53.437873 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:53 crc kubenswrapper[4810]: I1003 07:27:53.738101 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:54 crc kubenswrapper[4810]: I1003 07:27:54.225857 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:55 crc kubenswrapper[4810]: I1003 07:27:55.683384 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ksb5w" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="registry-server" containerID="cri-o://62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b" gracePeriod=2 Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.132119 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.181870 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlrgb\" (UniqueName: \"kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb\") pod \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.181998 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content\") pod \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.182160 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities\") pod \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\" (UID: \"7f8af941-a9d8-4e41-ab83-c3eb3372058e\") " Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.183217 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities" (OuterVolumeSpecName: "utilities") pod "7f8af941-a9d8-4e41-ab83-c3eb3372058e" (UID: "7f8af941-a9d8-4e41-ab83-c3eb3372058e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.191401 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb" (OuterVolumeSpecName: "kube-api-access-xlrgb") pod "7f8af941-a9d8-4e41-ab83-c3eb3372058e" (UID: "7f8af941-a9d8-4e41-ab83-c3eb3372058e"). InnerVolumeSpecName "kube-api-access-xlrgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.212956 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f8af941-a9d8-4e41-ab83-c3eb3372058e" (UID: "7f8af941-a9d8-4e41-ab83-c3eb3372058e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.283797 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlrgb\" (UniqueName: \"kubernetes.io/projected/7f8af941-a9d8-4e41-ab83-c3eb3372058e-kube-api-access-xlrgb\") on node \"crc\" DevicePath \"\"" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.283842 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.283855 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f8af941-a9d8-4e41-ab83-c3eb3372058e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.700002 4810 generic.go:334] "Generic (PLEG): container finished" podID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerID="62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b" exitCode=0 Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.700065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerDied","Data":"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b"} Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.700106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksb5w" event={"ID":"7f8af941-a9d8-4e41-ab83-c3eb3372058e","Type":"ContainerDied","Data":"51e95e7e9c0b7ad60553ee7e057ec3958112e3b45cb3276c40802ff3646dbdc2"} Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.700141 4810 scope.go:117] "RemoveContainer" containerID="62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.700330 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksb5w" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.734999 4810 scope.go:117] "RemoveContainer" containerID="1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.754768 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.764169 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksb5w"] Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.782717 4810 scope.go:117] "RemoveContainer" containerID="8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.820192 4810 scope.go:117] "RemoveContainer" containerID="62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b" Oct 03 07:27:56 crc kubenswrapper[4810]: E1003 07:27:56.821255 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b\": container with ID starting with 62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b not found: ID does not exist" containerID="62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.821464 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b"} err="failed to get container status \"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b\": rpc error: code = NotFound desc = could not find container \"62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b\": container with ID starting with 62cb065bbc642d9c8f30d4bfef2f997d312420b7fd6637e7e4f0ab95e8a7017b not found: ID does not exist" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.821509 4810 scope.go:117] "RemoveContainer" containerID="1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d" Oct 03 07:27:56 crc kubenswrapper[4810]: E1003 07:27:56.822227 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d\": container with ID starting with 1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d not found: ID does not exist" containerID="1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.822299 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d"} err="failed to get container status \"1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d\": rpc error: code = NotFound desc = could not find container \"1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d\": container with ID starting with 1fbee48d99f2b4438fb30b764762ec4f080f5e0083330f89f5114d009686ab8d not found: ID does not exist" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.822342 4810 scope.go:117] "RemoveContainer" containerID="8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9" Oct 03 07:27:56 crc kubenswrapper[4810]: E1003 07:27:56.822870 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9\": container with ID starting with 8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9 not found: ID does not exist" containerID="8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9" Oct 03 07:27:56 crc kubenswrapper[4810]: I1003 07:27:56.822981 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9"} err="failed to get container status \"8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9\": rpc error: code = NotFound desc = could not find container \"8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9\": container with ID starting with 8093b3ac5e323fa8d82373af8896e7e58c32f7a566b1b8005a9d93a720c30ec9 not found: ID does not exist" Oct 03 07:27:57 crc kubenswrapper[4810]: I1003 07:27:57.325400 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" path="/var/lib/kubelet/pods/7f8af941-a9d8-4e41-ab83-c3eb3372058e/volumes" Oct 03 07:28:32 crc kubenswrapper[4810]: I1003 07:28:32.089239 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:28:32 crc kubenswrapper[4810]: I1003 07:28:32.090051 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:29:02 crc kubenswrapper[4810]: I1003 07:29:02.089075 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:29:02 crc kubenswrapper[4810]: I1003 07:29:02.089834 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.062867 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:29 crc kubenswrapper[4810]: E1003 07:29:29.063947 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="extract-content" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.063971 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="extract-content" Oct 03 07:29:29 crc kubenswrapper[4810]: E1003 07:29:29.064009 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="extract-utilities" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.064020 4810 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="extract-utilities" Oct 03 07:29:29 crc kubenswrapper[4810]: E1003 07:29:29.064034 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="registry-server" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.064048 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="registry-server" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.064261 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8af941-a9d8-4e41-ab83-c3eb3372058e" containerName="registry-server" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.066612 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.078942 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.232337 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxt7c\" (UniqueName: \"kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.232417 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.232460 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.333509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.333577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxt7c\" (UniqueName: \"kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.333629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.334054 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.334280 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.350827 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxt7c\" (UniqueName: \"kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c\") pod \"redhat-operators-8lns7\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.390810 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:29 crc kubenswrapper[4810]: I1003 07:29:29.838194 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:30 crc kubenswrapper[4810]: I1003 07:29:30.542446 4810 generic.go:334] "Generic (PLEG): container finished" podID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerID="54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab" exitCode=0 Oct 03 07:29:30 crc kubenswrapper[4810]: I1003 07:29:30.542547 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerDied","Data":"54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab"} Oct 03 07:29:30 crc kubenswrapper[4810]: I1003 07:29:30.542827 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerStarted","Data":"b3e7cf1d592ee05806116e4053af3fff6c45f86b2df7282638f1b18d28e8b0c7"} Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.089271 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.089880 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.089939 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.090510 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.090554 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd" gracePeriod=600 Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.560081 4810 generic.go:334] "Generic (PLEG): container finished" podID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerID="f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e" exitCode=0 Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.560464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerDied","Data":"f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e"} Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.565787 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd" exitCode=0 Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.565832 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd"} Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.565867 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f"} Oct 03 07:29:32 crc kubenswrapper[4810]: I1003 07:29:32.565915 4810 scope.go:117] "RemoveContainer" containerID="4baea20e5180bf001e262de27e96d5f74fc904856228924ebd1feb65235d5b66" Oct 03 07:29:33 crc kubenswrapper[4810]: I1003 07:29:33.584760 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerStarted","Data":"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e"} Oct 03 07:29:33 crc kubenswrapper[4810]: I1003 07:29:33.614782 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lns7" podStartSLOduration=2.058474117 podStartE2EDuration="4.614765418s" podCreationTimestamp="2025-10-03 07:29:29 +0000 UTC" firstStartedPulling="2025-10-03 07:29:30.545105698 +0000 UTC m=+2003.972356473" lastFinishedPulling="2025-10-03 07:29:33.101397029 +0000 UTC m=+2006.528647774" observedRunningTime="2025-10-03 07:29:33.61076456 +0000 UTC m=+2007.038015375" watchObservedRunningTime="2025-10-03 07:29:33.614765418 +0000 UTC m=+2007.042016163" Oct 03 07:29:39 crc kubenswrapper[4810]: I1003 07:29:39.391790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:39 crc kubenswrapper[4810]: I1003 07:29:39.392649 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:39 crc kubenswrapper[4810]: I1003 07:29:39.468583 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:39 crc kubenswrapper[4810]: I1003 07:29:39.694827 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:39 crc kubenswrapper[4810]: I1003 07:29:39.767948 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:41 crc kubenswrapper[4810]: I1003 07:29:41.651965 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lns7" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="registry-server" containerID="cri-o://0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e" gracePeriod=2 Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.084653 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.245524 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities\") pod \"afaa40b0-e88f-4825-965a-c463b0b73ee5\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.245576 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxt7c\" (UniqueName: \"kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c\") pod \"afaa40b0-e88f-4825-965a-c463b0b73ee5\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.245689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content\") pod \"afaa40b0-e88f-4825-965a-c463b0b73ee5\" (UID: \"afaa40b0-e88f-4825-965a-c463b0b73ee5\") " Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.246555 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities" (OuterVolumeSpecName: "utilities") pod "afaa40b0-e88f-4825-965a-c463b0b73ee5" (UID: "afaa40b0-e88f-4825-965a-c463b0b73ee5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.252347 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c" (OuterVolumeSpecName: "kube-api-access-nxt7c") pod "afaa40b0-e88f-4825-965a-c463b0b73ee5" (UID: "afaa40b0-e88f-4825-965a-c463b0b73ee5"). InnerVolumeSpecName "kube-api-access-nxt7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.341117 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afaa40b0-e88f-4825-965a-c463b0b73ee5" (UID: "afaa40b0-e88f-4825-965a-c463b0b73ee5"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.347191 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxt7c\" (UniqueName: \"kubernetes.io/projected/afaa40b0-e88f-4825-965a-c463b0b73ee5-kube-api-access-nxt7c\") on node \"crc\" DevicePath \"\"" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.347226 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.347240 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afaa40b0-e88f-4825-965a-c463b0b73ee5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.661383 4810 generic.go:334] "Generic (PLEG): container finished" podID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerID="0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e" exitCode=0 Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.661427 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerDied","Data":"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e"} Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.661448 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lns7" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.661473 4810 scope.go:117] "RemoveContainer" containerID="0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.661459 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lns7" event={"ID":"afaa40b0-e88f-4825-965a-c463b0b73ee5","Type":"ContainerDied","Data":"b3e7cf1d592ee05806116e4053af3fff6c45f86b2df7282638f1b18d28e8b0c7"} Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.695047 4810 scope.go:117] "RemoveContainer" containerID="f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.707405 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.713254 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lns7"] Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.725537 4810 scope.go:117] "RemoveContainer" containerID="54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.747456 4810 scope.go:117] "RemoveContainer" containerID="0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e" Oct 03 07:29:42 crc kubenswrapper[4810]: E1003 07:29:42.747966 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e\": container with ID starting with 0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e not found: ID does not exist" containerID="0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e" Oct 03 07:29:42 crc kubenswrapper[4810]: 
I1003 07:29:42.748003 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e"} err="failed to get container status \"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e\": rpc error: code = NotFound desc = could not find container \"0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e\": container with ID starting with 0f727101842f8b68deac74b6b4463f256413e2f1a34e1cfc4bf1a134fa735a4e not found: ID does not exist" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.748023 4810 scope.go:117] "RemoveContainer" containerID="f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e" Oct 03 07:29:42 crc kubenswrapper[4810]: E1003 07:29:42.748218 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e\": container with ID starting with f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e not found: ID does not exist" containerID="f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.748243 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e"} err="failed to get container status \"f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e\": rpc error: code = NotFound desc = could not find container \"f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e\": container with ID starting with f26596496b1f50832c654e337bdea114b53bdb47e89ee49f5209956a657d670e not found: ID does not exist" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.748262 4810 scope.go:117] "RemoveContainer" containerID="54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab" Oct 03 07:29:42 crc kubenswrapper[4810]: E1003 07:29:42.748473 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab\": container with ID starting with 54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab not found: ID does not exist" containerID="54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab" Oct 03 07:29:42 crc kubenswrapper[4810]: I1003 07:29:42.748503 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab"} err="failed to get container status \"54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab\": rpc error: code = NotFound desc = could not find container \"54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab\": container with ID starting with 54015162e84b179d7ddcc66f922d6b69f3d4ea146629b0ee114fc1aa41eba1ab not found: ID does not exist" Oct 03 07:29:43 crc kubenswrapper[4810]: I1003 07:29:43.323658 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" path="/var/lib/kubelet/pods/afaa40b0-e88f-4825-965a-c463b0b73ee5/volumes" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.165677 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7"] Oct 03 07:30:00 crc kubenswrapper[4810]: E1003 07:30:00.166723 4810 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="extract-content" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.166743 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="extract-content" Oct 03 07:30:00 crc kubenswrapper[4810]: E1003 07:30:00.166763 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="extract-utilities" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.166771 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="extract-utilities" Oct 03 07:30:00 crc kubenswrapper[4810]: E1003 07:30:00.166785 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="registry-server" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.166793 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="registry-server" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.167119 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="afaa40b0-e88f-4825-965a-c463b0b73ee5" containerName="registry-server" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.167719 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.174060 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.174948 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.179208 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7"] Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.335968 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwc5j\" (UniqueName: \"kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.336020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.336150 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.438057 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bwc5j\" (UniqueName: \"kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.438135 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.438498 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.440343 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.448741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.473786 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwc5j\" (UniqueName: \"kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j\") pod \"collect-profiles-29324610-jqth7\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.499403 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.767565 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7"] Oct 03 07:30:00 crc kubenswrapper[4810]: I1003 07:30:00.824191 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" event={"ID":"d4ea9ecb-a1be-4648-84c8-4447f16fba29","Type":"ContainerStarted","Data":"28ba807fc5e95b682ea190bab6ba5448aea58d060cf876e7413301f3c6b63f73"} Oct 03 07:30:01 crc kubenswrapper[4810]: I1003 07:30:01.837102 4810 generic.go:334] "Generic (PLEG): container finished" podID="d4ea9ecb-a1be-4648-84c8-4447f16fba29" containerID="4b0b2387031e5171b1f3fa7f91f2122286ae9047ccad19cffef3595dfe15a459" exitCode=0 Oct 03 07:30:01 crc kubenswrapper[4810]: I1003 07:30:01.837183 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" event={"ID":"d4ea9ecb-a1be-4648-84c8-4447f16fba29","Type":"ContainerDied","Data":"4b0b2387031e5171b1f3fa7f91f2122286ae9047ccad19cffef3595dfe15a459"} Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.221051 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.389506 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume\") pod \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.389567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume\") pod \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.389713 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwc5j\" (UniqueName: \"kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j\") pod \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\" (UID: \"d4ea9ecb-a1be-4648-84c8-4447f16fba29\") " Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.390968 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume" (OuterVolumeSpecName: "config-volume") pod "d4ea9ecb-a1be-4648-84c8-4447f16fba29" (UID: "d4ea9ecb-a1be-4648-84c8-4447f16fba29"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.395415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j" (OuterVolumeSpecName: "kube-api-access-bwc5j") pod "d4ea9ecb-a1be-4648-84c8-4447f16fba29" (UID: "d4ea9ecb-a1be-4648-84c8-4447f16fba29"). InnerVolumeSpecName "kube-api-access-bwc5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.396340 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d4ea9ecb-a1be-4648-84c8-4447f16fba29" (UID: "d4ea9ecb-a1be-4648-84c8-4447f16fba29"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.491537 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4ea9ecb-a1be-4648-84c8-4447f16fba29-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.491582 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d4ea9ecb-a1be-4648-84c8-4447f16fba29-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.491596 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwc5j\" (UniqueName: \"kubernetes.io/projected/d4ea9ecb-a1be-4648-84c8-4447f16fba29-kube-api-access-bwc5j\") on node \"crc\" DevicePath \"\"" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.859029 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" event={"ID":"d4ea9ecb-a1be-4648-84c8-4447f16fba29","Type":"ContainerDied","Data":"28ba807fc5e95b682ea190bab6ba5448aea58d060cf876e7413301f3c6b63f73"} Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.859075 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28ba807fc5e95b682ea190bab6ba5448aea58d060cf876e7413301f3c6b63f73" Oct 03 07:30:03 crc kubenswrapper[4810]: I1003 07:30:03.859616 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7" Oct 03 07:30:04 crc kubenswrapper[4810]: I1003 07:30:04.306301 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz"] Oct 03 07:30:04 crc kubenswrapper[4810]: I1003 07:30:04.312731 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324565-s84mz"] Oct 03 07:30:05 crc kubenswrapper[4810]: I1003 07:30:05.311816 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a" path="/var/lib/kubelet/pods/8dfd0eb8-be4c-4ec1-9132-f1be43e6f03a/volumes" Oct 03 07:30:13 crc kubenswrapper[4810]: I1003 07:30:13.627014 4810 scope.go:117] "RemoveContainer" containerID="7aab4ae837560cb6087d7b83ac07a2cee7a88c4fe20105d09db697be89895c6e" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.419283 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 07:31:19 crc kubenswrapper[4810]: E1003 07:31:19.420302 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ea9ecb-a1be-4648-84c8-4447f16fba29" containerName="collect-profiles" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.420321 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ea9ecb-a1be-4648-84c8-4447f16fba29" containerName="collect-profiles" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.420550 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ea9ecb-a1be-4648-84c8-4447f16fba29" containerName="collect-profiles" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.422393 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.445448 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.503209 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.503430 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.503478 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7b2p\" (UniqueName: \"kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.604542 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.604598 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7b2p\" (UniqueName: \"kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.604665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.605303 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.605453 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.634802 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-m7b2p\" (UniqueName: \"kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p\") pod \"community-operators-k67nt\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:19 crc kubenswrapper[4810]: I1003 07:31:19.792941 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:20 crc kubenswrapper[4810]: I1003 07:31:20.257677 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 07:31:20 crc kubenswrapper[4810]: I1003 07:31:20.623344 4810 generic.go:334] "Generic (PLEG): container finished" podID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerID="e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6" exitCode=0 Oct 03 07:31:20 crc kubenswrapper[4810]: I1003 07:31:20.623402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerDied","Data":"e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6"} Oct 03 07:31:20 crc kubenswrapper[4810]: I1003 07:31:20.623826 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerStarted","Data":"8c6971ea82f27a55122ea62062765d7d56f305bf07dd87666694e163491276ad"} Oct 03 07:31:24 crc kubenswrapper[4810]: I1003 07:31:24.653871 4810 generic.go:334] "Generic (PLEG): container finished" podID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerID="d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650" exitCode=0 Oct 03 07:31:24 crc kubenswrapper[4810]: I1003 07:31:24.653941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerDied","Data":"d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650"} Oct 03 07:31:25 crc kubenswrapper[4810]: I1003 07:31:25.667004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerStarted","Data":"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3"} Oct 03 07:31:25 crc kubenswrapper[4810]: I1003 07:31:25.708813 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k67nt" podStartSLOduration=2.245594351 podStartE2EDuration="6.708785546s" podCreationTimestamp="2025-10-03 07:31:19 +0000 UTC" firstStartedPulling="2025-10-03 07:31:20.625206713 +0000 UTC m=+2114.052457458" lastFinishedPulling="2025-10-03 07:31:25.088397908 +0000 UTC m=+2118.515648653" observedRunningTime="2025-10-03 07:31:25.697707888 +0000 UTC m=+2119.124958703" watchObservedRunningTime="2025-10-03 07:31:25.708785546 +0000 UTC m=+2119.136036291" Oct 03 07:31:29 crc kubenswrapper[4810]: I1003 07:31:29.793718 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:29 crc kubenswrapper[4810]: I1003 07:31:29.794254 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:29 crc kubenswrapper[4810]: I1003 07:31:29.874188 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:30 crc kubenswrapper[4810]: I1003 07:31:30.804983 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k67nt" Oct 03 07:31:30 crc kubenswrapper[4810]: I1003 07:31:30.916709 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 07:31:30 crc kubenswrapper[4810]: I1003 07:31:30.999275 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 07:31:30 crc kubenswrapper[4810]: I1003 07:31:30.999647 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s8hjs" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="registry-server" containerID="cri-o://96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148" gracePeriod=2 Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.418479 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.586830 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content\") pod \"a48cb014-8481-4497-9047-8c319809f4cc\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.586933 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities\") pod \"a48cb014-8481-4497-9047-8c319809f4cc\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.587035 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6dfp\" (UniqueName: \"kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp\") pod \"a48cb014-8481-4497-9047-8c319809f4cc\" (UID: \"a48cb014-8481-4497-9047-8c319809f4cc\") " Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.587603 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities" (OuterVolumeSpecName: "utilities") pod "a48cb014-8481-4497-9047-8c319809f4cc" (UID: "a48cb014-8481-4497-9047-8c319809f4cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.593053 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp" (OuterVolumeSpecName: "kube-api-access-c6dfp") pod "a48cb014-8481-4497-9047-8c319809f4cc" (UID: "a48cb014-8481-4497-9047-8c319809f4cc"). InnerVolumeSpecName "kube-api-access-c6dfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.628941 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a48cb014-8481-4497-9047-8c319809f4cc" (UID: "a48cb014-8481-4497-9047-8c319809f4cc"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.688667 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.688702 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48cb014-8481-4497-9047-8c319809f4cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.688716 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6dfp\" (UniqueName: \"kubernetes.io/projected/a48cb014-8481-4497-9047-8c319809f4cc-kube-api-access-c6dfp\") on node \"crc\" DevicePath \"\"" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.734484 4810 generic.go:334] "Generic (PLEG): container finished" podID="a48cb014-8481-4497-9047-8c319809f4cc" containerID="96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148" exitCode=0 Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.735262 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s8hjs" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.735313 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerDied","Data":"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148"} Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.735346 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s8hjs" event={"ID":"a48cb014-8481-4497-9047-8c319809f4cc","Type":"ContainerDied","Data":"afd31301af32390b2e12643163aeb1846a5c205df7491b1d3122924166be8d4e"} Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.735366 4810 scope.go:117] "RemoveContainer" containerID="96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.766227 4810 scope.go:117] "RemoveContainer" containerID="2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.769051 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.780184 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s8hjs"] Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.791430 4810 scope.go:117] "RemoveContainer" containerID="5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.817319 4810 scope.go:117] "RemoveContainer" containerID="96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148" Oct 03 07:31:31 crc kubenswrapper[4810]: E1003 07:31:31.820328 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148\": container with ID starting with 96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148 not found: ID does not exist" containerID="96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148" Oct 03 07:31:31 crc 
kubenswrapper[4810]: I1003 07:31:31.820369 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148"} err="failed to get container status \"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148\": rpc error: code = NotFound desc = could not find container \"96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148\": container with ID starting with 96eeb22347adcf6120435a6ae1993f7bd7092a4c9e4b4a85f04bfc99ac7bb148 not found: ID does not exist" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.820393 4810 scope.go:117] "RemoveContainer" containerID="2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2" Oct 03 07:31:31 crc kubenswrapper[4810]: E1003 07:31:31.820858 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2\": container with ID starting with 2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2 not found: ID does not exist" containerID="2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.820887 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2"} err="failed to get container status \"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2\": rpc error: code = NotFound desc = could not find container \"2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2\": container with ID starting with 2a479ffdc1efd64ee2b6a906fabd043cae2a837ac09d66c84f2793a970fba0c2 not found: ID does not exist" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.820916 4810 scope.go:117] "RemoveContainer" containerID="5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735" Oct 03 07:31:31 crc kubenswrapper[4810]: E1003 07:31:31.821111 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735\": container with ID starting with 5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735 not found: ID does not exist" containerID="5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735" Oct 03 07:31:31 crc kubenswrapper[4810]: I1003 07:31:31.821133 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735"} err="failed to get container status \"5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735\": rpc error: code = NotFound desc = could not find container \"5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735\": container with ID starting with 5bca463075674ff86564a4f2b2d33ffff9c1b95e005e39ac6ce519f2acb4a735 not found: ID does not exist" Oct 03 07:31:32 crc kubenswrapper[4810]: I1003 07:31:32.088675 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:31:32 crc kubenswrapper[4810]: I1003 07:31:32.088770 4810 prober.go:107] "Probe failed" probeType="Liveness" 
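The paired "ContainerStatus from runtime service failed ... NotFound" errors and "DeleteContainer returned error" messages above are a benign race: the kubelet asks CRI-O about containers it is cleaning up after the runtime has already removed them. A sketch of the usual way to keep such cleanup idempotent, treating gRPC NotFound as success (my own illustration built on the grpc-go status package, not kubelet code):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeFunc stands in for a CRI RemoveContainer call (CRI is gRPC, hence the
// "rpc error: code = NotFound" text in the entries above).
type removeFunc func(id string) error

// cleanup treats NotFound as "already gone" so repeated deletes stay harmless.
func cleanup(id string, remove removeFunc) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // the runtime removed it first; nothing left to do
		}
		return fmt.Errorf("removing container %s: %w", id, err)
	}
	return nil
}

func main() {
	notFound := status.Error(codes.NotFound, "could not find container")
	fmt.Println(cleanup("96eeb223", func(string) error { return notFound })) // <nil>
}
```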
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:31:33 crc kubenswrapper[4810]: I1003 07:31:33.313582 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48cb014-8481-4497-9047-8c319809f4cc" path="/var/lib/kubelet/pods/a48cb014-8481-4497-9047-8c319809f4cc/volumes" Oct 03 07:32:02 crc kubenswrapper[4810]: I1003 07:32:02.089054 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:32:02 crc kubenswrapper[4810]: I1003 07:32:02.089971 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.088728 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.091233 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.091599 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.092828 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.093257 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" gracePeriod=600 Oct 03 07:32:32 crc kubenswrapper[4810]: E1003 07:32:32.239089 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 
07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.291642 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" exitCode=0 Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.291690 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f"} Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.291728 4810 scope.go:117] "RemoveContainer" containerID="9ff04d25664a23c073479ef4951ce52ecbc3ec250b4325b1004701e702c616cd" Oct 03 07:32:32 crc kubenswrapper[4810]: I1003 07:32:32.292316 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:32:32 crc kubenswrapper[4810]: E1003 07:32:32.292554 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:32:45 crc kubenswrapper[4810]: I1003 07:32:45.301979 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:32:45 crc kubenswrapper[4810]: E1003 07:32:45.302645 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:32:56 crc kubenswrapper[4810]: I1003 07:32:56.303148 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:32:56 crc kubenswrapper[4810]: E1003 07:32:56.303956 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:33:09 crc kubenswrapper[4810]: I1003 07:33:09.303186 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:33:09 crc kubenswrapper[4810]: E1003 07:33:09.304203 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:33:21 crc kubenswrapper[4810]: I1003 07:33:21.303068 4810 scope.go:117] "RemoveContainer" 
containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:33:21 crc kubenswrapper[4810]: E1003 07:33:21.304336 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:33:33 crc kubenswrapper[4810]: I1003 07:33:33.302886 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:33:33 crc kubenswrapper[4810]: E1003 07:33:33.303631 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:33:45 crc kubenswrapper[4810]: I1003 07:33:45.303267 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:33:45 crc kubenswrapper[4810]: E1003 07:33:45.304185 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:34:00 crc kubenswrapper[4810]: I1003 07:34:00.304511 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:34:00 crc kubenswrapper[4810]: E1003 07:34:00.305630 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:34:12 crc kubenswrapper[4810]: I1003 07:34:12.303206 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:34:12 crc kubenswrapper[4810]: E1003 07:34:12.304052 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:34:25 crc kubenswrapper[4810]: I1003 07:34:25.302717 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:34:25 crc kubenswrapper[4810]: E1003 07:34:25.303500 4810 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:34:39 crc kubenswrapper[4810]: I1003 07:34:39.302707 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:34:39 crc kubenswrapper[4810]: E1003 07:34:39.303649 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:34:54 crc kubenswrapper[4810]: I1003 07:34:54.303104 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:34:54 crc kubenswrapper[4810]: E1003 07:34:54.303921 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:35:07 crc kubenswrapper[4810]: I1003 07:35:07.309042 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:35:07 crc kubenswrapper[4810]: E1003 07:35:07.310146 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:35:22 crc kubenswrapper[4810]: I1003 07:35:22.302664 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:35:22 crc kubenswrapper[4810]: E1003 07:35:22.303520 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:35:36 crc kubenswrapper[4810]: I1003 07:35:36.303489 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:35:36 crc kubenswrapper[4810]: E1003 07:35:36.304499 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:35:51 crc kubenswrapper[4810]: I1003 07:35:51.302525 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:35:51 crc kubenswrapper[4810]: E1003 07:35:51.303295 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:36:03 crc kubenswrapper[4810]: I1003 07:36:03.302998 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:36:03 crc kubenswrapper[4810]: E1003 07:36:03.304138 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:36:18 crc kubenswrapper[4810]: I1003 07:36:18.303310 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:36:18 crc kubenswrapper[4810]: E1003 07:36:18.304167 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:36:32 crc kubenswrapper[4810]: I1003 07:36:32.303034 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:36:32 crc kubenswrapper[4810]: E1003 07:36:32.303783 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:36:46 crc kubenswrapper[4810]: I1003 07:36:46.302696 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:36:46 crc kubenswrapper[4810]: E1003 07:36:46.303443 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
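The repeating "back-off 5m0s restarting failed container" errors are the periodic pod sync hitting an open crash-loop back-off window for machine-config-daemon: the restart delay roughly doubles per failure until it reaches a cap, and the container is only started again once the window expires (it finally restarts at 07:37:40 below). A Go sketch of that doubling-with-cap schedule; the 10s base and 5m cap match upstream kubelet defaults and are assumptions here, not values read from this node:

```go
package main

import (
	"fmt"
	"time"
)

// backoffSchedule doubles the restart delay per failure and clamps it at max.
// Assumed defaults: 10s initial delay, 5m cap.
func backoffSchedule(failures int, initial, max time.Duration) []time.Duration {
	out := make([]time.Duration, 0, failures)
	d := initial
	for i := 0; i < failures; i++ {
		out = append(out, d)
		d *= 2
		if d > max {
			d = max
		}
	}
	return out
}

func main() {
	// Prints 10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s — once the cap is reached
	// the kubelet keeps reporting "back-off 5m0s restarting failed container".
	fmt.Println(backoffSchedule(8, 10*time.Second, 5*time.Minute))
}
```

The sync errors above recur every 10-15 seconds because the pod worker keeps re-evaluating the pod; the 5m figure is the remaining back-off, not the retry period.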
podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:37:01 crc kubenswrapper[4810]: I1003 07:37:01.302837 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:37:01 crc kubenswrapper[4810]: E1003 07:37:01.303653 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:37:12 crc kubenswrapper[4810]: I1003 07:37:12.303010 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:37:12 crc kubenswrapper[4810]: E1003 07:37:12.303807 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:37:26 crc kubenswrapper[4810]: I1003 07:37:26.301972 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:37:26 crc kubenswrapper[4810]: E1003 07:37:26.303286 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:37:40 crc kubenswrapper[4810]: I1003 07:37:40.303175 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:37:40 crc kubenswrapper[4810]: I1003 07:37:40.847292 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf"} Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.356284 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:30 crc kubenswrapper[4810]: E1003 07:38:30.357258 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="registry-server" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.357278 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="registry-server" Oct 03 07:38:30 crc kubenswrapper[4810]: E1003 07:38:30.357312 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="extract-utilities" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.357320 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="extract-utilities" Oct 03 
07:38:30 crc kubenswrapper[4810]: E1003 07:38:30.357334 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="extract-content" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.357341 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="extract-content" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.357502 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48cb014-8481-4497-9047-8c319809f4cc" containerName="registry-server" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.361361 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.363784 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.395276 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkd2f\" (UniqueName: \"kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.395359 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.395433 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.497293 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.497372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkd2f\" (UniqueName: \"kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.497417 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.498013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.498726 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.521307 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkd2f\" (UniqueName: \"kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f\") pod \"redhat-marketplace-2rlgw\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:30 crc kubenswrapper[4810]: I1003 07:38:30.690550 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:31 crc kubenswrapper[4810]: I1003 07:38:31.000587 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:31 crc kubenswrapper[4810]: I1003 07:38:31.284635 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerStarted","Data":"e9533902ffb27818aeeaea4b52a31b110c498166fea34e6f1a8cea8b516dd86c"} Oct 03 07:38:32 crc kubenswrapper[4810]: I1003 07:38:32.293069 4810 generic.go:334] "Generic (PLEG): container finished" podID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerID="dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251" exitCode=0 Oct 03 07:38:32 crc kubenswrapper[4810]: I1003 07:38:32.293162 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerDied","Data":"dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251"} Oct 03 07:38:32 crc kubenswrapper[4810]: I1003 07:38:32.295380 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:38:34 crc kubenswrapper[4810]: I1003 07:38:34.328630 4810 generic.go:334] "Generic (PLEG): container finished" podID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerID="e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37" exitCode=0 Oct 03 07:38:34 crc kubenswrapper[4810]: I1003 07:38:34.328739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerDied","Data":"e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37"} Oct 03 07:38:35 crc kubenswrapper[4810]: I1003 07:38:35.340071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerStarted","Data":"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74"} Oct 03 07:38:35 crc kubenswrapper[4810]: I1003 07:38:35.364973 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2rlgw" podStartSLOduration=2.844432467 
podStartE2EDuration="5.364956073s" podCreationTimestamp="2025-10-03 07:38:30 +0000 UTC" firstStartedPulling="2025-10-03 07:38:32.295121576 +0000 UTC m=+2545.722372311" lastFinishedPulling="2025-10-03 07:38:34.815645182 +0000 UTC m=+2548.242895917" observedRunningTime="2025-10-03 07:38:35.361657374 +0000 UTC m=+2548.788908129" watchObservedRunningTime="2025-10-03 07:38:35.364956073 +0000 UTC m=+2548.792206808" Oct 03 07:38:40 crc kubenswrapper[4810]: I1003 07:38:40.692376 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:40 crc kubenswrapper[4810]: I1003 07:38:40.692840 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:40 crc kubenswrapper[4810]: I1003 07:38:40.746585 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:41 crc kubenswrapper[4810]: I1003 07:38:41.434320 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:41 crc kubenswrapper[4810]: I1003 07:38:41.494659 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.409880 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2rlgw" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="registry-server" containerID="cri-o://692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74" gracePeriod=2 Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.857032 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.996825 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkd2f\" (UniqueName: \"kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f\") pod \"afb5f2d1-d776-4327-a060-8c3b33089fb6\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.997118 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities\") pod \"afb5f2d1-d776-4327-a060-8c3b33089fb6\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.997177 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content\") pod \"afb5f2d1-d776-4327-a060-8c3b33089fb6\" (UID: \"afb5f2d1-d776-4327-a060-8c3b33089fb6\") " Oct 03 07:38:43 crc kubenswrapper[4810]: I1003 07:38:43.998033 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities" (OuterVolumeSpecName: "utilities") pod "afb5f2d1-d776-4327-a060-8c3b33089fb6" (UID: "afb5f2d1-d776-4327-a060-8c3b33089fb6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.003048 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f" (OuterVolumeSpecName: "kube-api-access-nkd2f") pod "afb5f2d1-d776-4327-a060-8c3b33089fb6" (UID: "afb5f2d1-d776-4327-a060-8c3b33089fb6"). InnerVolumeSpecName "kube-api-access-nkd2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.014224 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afb5f2d1-d776-4327-a060-8c3b33089fb6" (UID: "afb5f2d1-d776-4327-a060-8c3b33089fb6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.098668 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.098729 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afb5f2d1-d776-4327-a060-8c3b33089fb6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.098755 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkd2f\" (UniqueName: \"kubernetes.io/projected/afb5f2d1-d776-4327-a060-8c3b33089fb6-kube-api-access-nkd2f\") on node \"crc\" DevicePath \"\"" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.419611 4810 generic.go:334] "Generic (PLEG): container finished" podID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerID="692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74" exitCode=0 Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.419698 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerDied","Data":"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74"} Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.419705 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2rlgw" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.419738 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2rlgw" event={"ID":"afb5f2d1-d776-4327-a060-8c3b33089fb6","Type":"ContainerDied","Data":"e9533902ffb27818aeeaea4b52a31b110c498166fea34e6f1a8cea8b516dd86c"} Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.419757 4810 scope.go:117] "RemoveContainer" containerID="692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.463828 4810 scope.go:117] "RemoveContainer" containerID="e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.473187 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.480770 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2rlgw"] Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.494113 4810 scope.go:117] "RemoveContainer" containerID="dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.531921 4810 scope.go:117] "RemoveContainer" containerID="692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74" Oct 03 07:38:44 crc kubenswrapper[4810]: E1003 07:38:44.532438 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74\": container with ID starting with 692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74 not found: ID does not exist" containerID="692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.532476 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74"} err="failed to get container status \"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74\": rpc error: code = NotFound desc = could not find container \"692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74\": container with ID starting with 692f416c564ccb9e24b639ff7788dc46ecc0396f960b8f96b8759cbd4f5e5e74 not found: ID does not exist" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.532499 4810 scope.go:117] "RemoveContainer" containerID="e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37" Oct 03 07:38:44 crc kubenswrapper[4810]: E1003 07:38:44.533110 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37\": container with ID starting with e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37 not found: ID does not exist" containerID="e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.533137 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37"} err="failed to get container status \"e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37\": rpc error: code = NotFound desc = could not find 
container \"e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37\": container with ID starting with e8b52dacf0e7924135c3190e15fefd19e036d3cc4516014fb673e4a4581eef37 not found: ID does not exist" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.533164 4810 scope.go:117] "RemoveContainer" containerID="dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251" Oct 03 07:38:44 crc kubenswrapper[4810]: E1003 07:38:44.533671 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251\": container with ID starting with dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251 not found: ID does not exist" containerID="dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251" Oct 03 07:38:44 crc kubenswrapper[4810]: I1003 07:38:44.533697 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251"} err="failed to get container status \"dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251\": rpc error: code = NotFound desc = could not find container \"dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251\": container with ID starting with dd88b0e1debb52844e4a3cff8248a46603a3439c4673add5f7618cbe68ebd251 not found: ID does not exist" Oct 03 07:38:45 crc kubenswrapper[4810]: I1003 07:38:45.314164 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" path="/var/lib/kubelet/pods/afb5f2d1-d776-4327-a060-8c3b33089fb6/volumes" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.650327 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:39:52 crc kubenswrapper[4810]: E1003 07:39:52.651579 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="extract-utilities" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.651596 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="extract-utilities" Oct 03 07:39:52 crc kubenswrapper[4810]: E1003 07:39:52.651618 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="registry-server" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.651624 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="registry-server" Oct 03 07:39:52 crc kubenswrapper[4810]: E1003 07:39:52.651638 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="extract-content" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.651648 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="extract-content" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.651829 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb5f2d1-d776-4327-a060-8c3b33089fb6" containerName="registry-server" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.653949 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.671265 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.745583 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.746292 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9h4q\" (UniqueName: \"kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.746359 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.847945 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.848040 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9h4q\" (UniqueName: \"kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.848089 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.848638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.848782 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.872226 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d9h4q\" (UniqueName: \"kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q\") pod \"redhat-operators-8csj9\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:52 crc kubenswrapper[4810]: I1003 07:39:52.981164 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:39:53 crc kubenswrapper[4810]: I1003 07:39:53.423202 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:39:53 crc kubenswrapper[4810]: I1003 07:39:53.962849 4810 generic.go:334] "Generic (PLEG): container finished" podID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerID="6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47" exitCode=0 Oct 03 07:39:53 crc kubenswrapper[4810]: I1003 07:39:53.963190 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerDied","Data":"6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47"} Oct 03 07:39:53 crc kubenswrapper[4810]: I1003 07:39:53.963218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerStarted","Data":"92d2c2e81c19c3a4c5d19a1ad96a2bdf590243166eda9bb50e00e41a1616ddcc"} Oct 03 07:39:55 crc kubenswrapper[4810]: I1003 07:39:55.977079 4810 generic.go:334] "Generic (PLEG): container finished" podID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerID="fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef" exitCode=0 Oct 03 07:39:55 crc kubenswrapper[4810]: I1003 07:39:55.977193 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerDied","Data":"fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef"} Oct 03 07:39:56 crc kubenswrapper[4810]: I1003 07:39:56.986751 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerStarted","Data":"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec"} Oct 03 07:39:57 crc kubenswrapper[4810]: I1003 07:39:57.013010 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8csj9" podStartSLOduration=2.54906565 podStartE2EDuration="5.012983556s" podCreationTimestamp="2025-10-03 07:39:52 +0000 UTC" firstStartedPulling="2025-10-03 07:39:53.964644656 +0000 UTC m=+2627.391895391" lastFinishedPulling="2025-10-03 07:39:56.428562562 +0000 UTC m=+2629.855813297" observedRunningTime="2025-10-03 07:39:57.006783169 +0000 UTC m=+2630.434033924" watchObservedRunningTime="2025-10-03 07:39:57.012983556 +0000 UTC m=+2630.440234291" Oct 03 07:40:02 crc kubenswrapper[4810]: I1003 07:40:02.088428 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:40:02 crc kubenswrapper[4810]: I1003 07:40:02.088494 4810 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:40:02 crc kubenswrapper[4810]: I1003 07:40:02.981494 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:02 crc kubenswrapper[4810]: I1003 07:40:02.981797 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:03 crc kubenswrapper[4810]: I1003 07:40:03.047211 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:03 crc kubenswrapper[4810]: I1003 07:40:03.088001 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:03 crc kubenswrapper[4810]: I1003 07:40:03.282607 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.056978 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8csj9" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="registry-server" containerID="cri-o://65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec" gracePeriod=2 Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.494312 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.651604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content\") pod \"1217a72b-a28f-4bb9-89c8-a600173ca051\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.651665 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9h4q\" (UniqueName: \"kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q\") pod \"1217a72b-a28f-4bb9-89c8-a600173ca051\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.651810 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities\") pod \"1217a72b-a28f-4bb9-89c8-a600173ca051\" (UID: \"1217a72b-a28f-4bb9-89c8-a600173ca051\") " Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.652731 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities" (OuterVolumeSpecName: "utilities") pod "1217a72b-a28f-4bb9-89c8-a600173ca051" (UID: "1217a72b-a28f-4bb9-89c8-a600173ca051"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.657084 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q" (OuterVolumeSpecName: "kube-api-access-d9h4q") pod "1217a72b-a28f-4bb9-89c8-a600173ca051" (UID: "1217a72b-a28f-4bb9-89c8-a600173ca051"). InnerVolumeSpecName "kube-api-access-d9h4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.753992 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9h4q\" (UniqueName: \"kubernetes.io/projected/1217a72b-a28f-4bb9-89c8-a600173ca051-kube-api-access-d9h4q\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.754033 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.759393 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1217a72b-a28f-4bb9-89c8-a600173ca051" (UID: "1217a72b-a28f-4bb9-89c8-a600173ca051"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:40:05 crc kubenswrapper[4810]: I1003 07:40:05.855499 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1217a72b-a28f-4bb9-89c8-a600173ca051-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.067608 4810 generic.go:334] "Generic (PLEG): container finished" podID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerID="65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec" exitCode=0 Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.067674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerDied","Data":"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec"} Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.067747 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8csj9" event={"ID":"1217a72b-a28f-4bb9-89c8-a600173ca051","Type":"ContainerDied","Data":"92d2c2e81c19c3a4c5d19a1ad96a2bdf590243166eda9bb50e00e41a1616ddcc"} Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.067755 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8csj9" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.067779 4810 scope.go:117] "RemoveContainer" containerID="65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.089128 4810 scope.go:117] "RemoveContainer" containerID="fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.115442 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.118802 4810 scope.go:117] "RemoveContainer" containerID="6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.120748 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8csj9"] Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.148650 4810 scope.go:117] "RemoveContainer" containerID="65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec" Oct 03 07:40:06 crc kubenswrapper[4810]: E1003 07:40:06.149154 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec\": container with ID starting with 65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec not found: ID does not exist" containerID="65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.149199 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec"} err="failed to get container status \"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec\": rpc error: code = NotFound desc = could not find container \"65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec\": container with ID starting with 65eb0213460c93a9d493a9d4e7dd764fb4a67ac481327ab989663f55c57704ec not found: ID does not exist" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.149225 4810 scope.go:117] "RemoveContainer" containerID="fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef" Oct 03 07:40:06 crc kubenswrapper[4810]: E1003 07:40:06.149726 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef\": container with ID starting with fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef not found: ID does not exist" containerID="fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.149776 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef"} err="failed to get container status \"fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef\": rpc error: code = NotFound desc = could not find container \"fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef\": container with ID starting with fcbc4b06735c70613f18aab0d1b674b9dcbdb4e1e9006b8cdee6789c2d0373ef not found: ID does not exist" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.149805 4810 scope.go:117] "RemoveContainer" 
containerID="6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47" Oct 03 07:40:06 crc kubenswrapper[4810]: E1003 07:40:06.150330 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47\": container with ID starting with 6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47 not found: ID does not exist" containerID="6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47" Oct 03 07:40:06 crc kubenswrapper[4810]: I1003 07:40:06.150367 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47"} err="failed to get container status \"6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47\": rpc error: code = NotFound desc = could not find container \"6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47\": container with ID starting with 6ae75b3bc8b2820d0d294a75d6b3c678cfd94f061d5cf2a5f25f17364d1fcd47 not found: ID does not exist" Oct 03 07:40:07 crc kubenswrapper[4810]: I1003 07:40:07.313350 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" path="/var/lib/kubelet/pods/1217a72b-a28f-4bb9-89c8-a600173ca051/volumes" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.861585 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:20 crc kubenswrapper[4810]: E1003 07:40:20.863000 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="extract-content" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.863034 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="extract-content" Oct 03 07:40:20 crc kubenswrapper[4810]: E1003 07:40:20.863071 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="extract-utilities" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.863090 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="extract-utilities" Oct 03 07:40:20 crc kubenswrapper[4810]: E1003 07:40:20.863128 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="registry-server" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.863146 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="registry-server" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.863536 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1217a72b-a28f-4bb9-89c8-a600173ca051" containerName="registry-server" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.865995 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.874091 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.981329 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84sh7\" (UniqueName: \"kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.981414 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:20 crc kubenswrapper[4810]: I1003 07:40:20.981476 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.082886 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84sh7\" (UniqueName: \"kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.082964 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.082996 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.083638 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.083803 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.104693 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-84sh7\" (UniqueName: \"kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7\") pod \"certified-operators-7bh74\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.184083 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:21 crc kubenswrapper[4810]: I1003 07:40:21.527311 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:22 crc kubenswrapper[4810]: I1003 07:40:22.216022 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerID="dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0" exitCode=0 Oct 03 07:40:22 crc kubenswrapper[4810]: I1003 07:40:22.216158 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerDied","Data":"dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0"} Oct 03 07:40:22 crc kubenswrapper[4810]: I1003 07:40:22.216383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerStarted","Data":"d0f9160b39b0de78433e2298a13ef5eb56796606153df46981cb6cb5489a7414"} Oct 03 07:40:23 crc kubenswrapper[4810]: I1003 07:40:23.228080 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerStarted","Data":"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9"} Oct 03 07:40:24 crc kubenswrapper[4810]: I1003 07:40:24.237108 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerID="955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9" exitCode=0 Oct 03 07:40:24 crc kubenswrapper[4810]: I1003 07:40:24.237154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerDied","Data":"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9"} Oct 03 07:40:25 crc kubenswrapper[4810]: I1003 07:40:25.249643 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerStarted","Data":"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75"} Oct 03 07:40:25 crc kubenswrapper[4810]: I1003 07:40:25.274214 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7bh74" podStartSLOduration=2.781368213 podStartE2EDuration="5.274182704s" podCreationTimestamp="2025-10-03 07:40:20 +0000 UTC" firstStartedPulling="2025-10-03 07:40:22.217285576 +0000 UTC m=+2655.644536301" lastFinishedPulling="2025-10-03 07:40:24.710100057 +0000 UTC m=+2658.137350792" observedRunningTime="2025-10-03 07:40:25.269456318 +0000 UTC m=+2658.696707053" watchObservedRunningTime="2025-10-03 07:40:25.274182704 +0000 UTC m=+2658.701433439" Oct 03 07:40:31 crc kubenswrapper[4810]: I1003 07:40:31.185106 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:31 crc kubenswrapper[4810]: I1003 07:40:31.185698 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:31 crc kubenswrapper[4810]: I1003 07:40:31.246612 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:31 crc kubenswrapper[4810]: I1003 07:40:31.348394 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:31 crc kubenswrapper[4810]: I1003 07:40:31.486306 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:32 crc kubenswrapper[4810]: I1003 07:40:32.088530 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:40:32 crc kubenswrapper[4810]: I1003 07:40:32.088592 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.314365 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7bh74" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="registry-server" containerID="cri-o://a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75" gracePeriod=2 Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.754422 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.894887 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities\") pod \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.894973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content\") pod \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.895033 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84sh7\" (UniqueName: \"kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7\") pod \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\" (UID: \"d5795c6a-0f39-4a24-9fa1-816f21b8a094\") " Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.896045 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities" (OuterVolumeSpecName: "utilities") pod "d5795c6a-0f39-4a24-9fa1-816f21b8a094" (UID: "d5795c6a-0f39-4a24-9fa1-816f21b8a094"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.900384 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7" (OuterVolumeSpecName: "kube-api-access-84sh7") pod "d5795c6a-0f39-4a24-9fa1-816f21b8a094" (UID: "d5795c6a-0f39-4a24-9fa1-816f21b8a094"). InnerVolumeSpecName "kube-api-access-84sh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.996518 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:33 crc kubenswrapper[4810]: I1003 07:40:33.996572 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84sh7\" (UniqueName: \"kubernetes.io/projected/d5795c6a-0f39-4a24-9fa1-816f21b8a094-kube-api-access-84sh7\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.324321 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerID="a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75" exitCode=0 Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.324384 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerDied","Data":"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75"} Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.324437 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7bh74" event={"ID":"d5795c6a-0f39-4a24-9fa1-816f21b8a094","Type":"ContainerDied","Data":"d0f9160b39b0de78433e2298a13ef5eb56796606153df46981cb6cb5489a7414"} Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.324458 4810 scope.go:117] "RemoveContainer" containerID="a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.324459 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7bh74" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.352375 4810 scope.go:117] "RemoveContainer" containerID="955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.379748 4810 scope.go:117] "RemoveContainer" containerID="dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.406520 4810 scope.go:117] "RemoveContainer" containerID="a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75" Oct 03 07:40:34 crc kubenswrapper[4810]: E1003 07:40:34.407044 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75\": container with ID starting with a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75 not found: ID does not exist" containerID="a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.407073 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75"} err="failed to get container status \"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75\": rpc error: code = NotFound desc = could not find container \"a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75\": container with ID starting with a3cb797029c582e5fc8359334ade36c790303f479421175adeaa4d45056a6d75 not found: ID does not exist" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.407093 4810 scope.go:117] "RemoveContainer" containerID="955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9" Oct 03 07:40:34 crc kubenswrapper[4810]: E1003 07:40:34.407385 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9\": container with ID starting with 955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9 not found: ID does not exist" containerID="955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.407425 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9"} err="failed to get container status \"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9\": rpc error: code = NotFound desc = could not find container \"955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9\": container with ID starting with 955f2b7304fcc0e031c9a1d01ff1cb5a189fe2b9db5f5be691822a863fac7aa9 not found: ID does not exist" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.407451 4810 scope.go:117] "RemoveContainer" containerID="dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0" Oct 03 07:40:34 crc kubenswrapper[4810]: E1003 07:40:34.407957 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0\": container with ID starting with dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0 not found: ID does not exist" containerID="dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0" 
Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.408004 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0"} err="failed to get container status \"dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0\": rpc error: code = NotFound desc = could not find container \"dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0\": container with ID starting with dfe1aad6aef02ef50886b430f394bb881662b1c24a321812b3d420fb50362ec0 not found: ID does not exist" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.810958 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5795c6a-0f39-4a24-9fa1-816f21b8a094" (UID: "d5795c6a-0f39-4a24-9fa1-816f21b8a094"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.909707 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5795c6a-0f39-4a24-9fa1-816f21b8a094-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.955578 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:34 crc kubenswrapper[4810]: I1003 07:40:34.962911 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7bh74"] Oct 03 07:40:35 crc kubenswrapper[4810]: I1003 07:40:35.312626 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" path="/var/lib/kubelet/pods/d5795c6a-0f39-4a24-9fa1-816f21b8a094/volumes" Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.088739 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.089348 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.089404 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.090076 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.090123 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" 
containerName="machine-config-daemon" containerID="cri-o://99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf" gracePeriod=600 Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.561271 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf" exitCode=0 Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.561353 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf"} Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.561915 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a"} Oct 03 07:41:02 crc kubenswrapper[4810]: I1003 07:41:02.561955 4810 scope.go:117] "RemoveContainer" containerID="a50f0518766dce41023a5145dcc01cf7c4ea30fffbd6389a88d3af15f78fd10f" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.533208 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:15 crc kubenswrapper[4810]: E1003 07:42:15.534568 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="extract-utilities" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.534592 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="extract-utilities" Oct 03 07:42:15 crc kubenswrapper[4810]: E1003 07:42:15.534757 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="extract-content" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.534772 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="extract-content" Oct 03 07:42:15 crc kubenswrapper[4810]: E1003 07:42:15.534790 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="registry-server" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.534805 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="registry-server" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.535073 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5795c6a-0f39-4a24-9fa1-816f21b8a094" containerName="registry-server" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.536836 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.548449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.633559 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.633627 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.633669 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzplc\" (UniqueName: \"kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.736547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.736651 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.736873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzplc\" (UniqueName: \"kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.737973 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.738219 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.775497 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dzplc\" (UniqueName: \"kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc\") pod \"community-operators-wdxk9\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:15 crc kubenswrapper[4810]: I1003 07:42:15.863380 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:16 crc kubenswrapper[4810]: I1003 07:42:16.119951 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:16 crc kubenswrapper[4810]: I1003 07:42:16.241734 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerStarted","Data":"55096c816842e2df3e7691595b118895cad13eddfd4a7accd9464ba57cd676c6"} Oct 03 07:42:17 crc kubenswrapper[4810]: I1003 07:42:17.255944 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerID="c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc" exitCode=0 Oct 03 07:42:17 crc kubenswrapper[4810]: I1003 07:42:17.256028 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerDied","Data":"c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc"} Oct 03 07:42:18 crc kubenswrapper[4810]: I1003 07:42:18.267092 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerStarted","Data":"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5"} Oct 03 07:42:19 crc kubenswrapper[4810]: I1003 07:42:19.279089 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerID="1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5" exitCode=0 Oct 03 07:42:19 crc kubenswrapper[4810]: I1003 07:42:19.279199 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerDied","Data":"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5"} Oct 03 07:42:20 crc kubenswrapper[4810]: I1003 07:42:20.291556 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerStarted","Data":"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26"} Oct 03 07:42:20 crc kubenswrapper[4810]: I1003 07:42:20.318207 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wdxk9" podStartSLOduration=2.730931838 podStartE2EDuration="5.318181716s" podCreationTimestamp="2025-10-03 07:42:15 +0000 UTC" firstStartedPulling="2025-10-03 07:42:17.26135647 +0000 UTC m=+2770.688607205" lastFinishedPulling="2025-10-03 07:42:19.848606338 +0000 UTC m=+2773.275857083" observedRunningTime="2025-10-03 07:42:20.311960398 +0000 UTC m=+2773.739211143" watchObservedRunningTime="2025-10-03 07:42:20.318181716 +0000 UTC m=+2773.745432461" Oct 03 07:42:25 crc kubenswrapper[4810]: I1003 07:42:25.864062 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:25 crc kubenswrapper[4810]: I1003 07:42:25.864660 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:25 crc kubenswrapper[4810]: I1003 07:42:25.919162 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:26 crc kubenswrapper[4810]: I1003 07:42:26.397376 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:27 crc kubenswrapper[4810]: I1003 07:42:27.526407 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.367278 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wdxk9" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="registry-server" containerID="cri-o://0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26" gracePeriod=2 Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.802436 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.967499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities\") pod \"c3870c3e-94be-464c-8d0d-2f323a58066b\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.967622 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzplc\" (UniqueName: \"kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc\") pod \"c3870c3e-94be-464c-8d0d-2f323a58066b\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.967695 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content\") pod \"c3870c3e-94be-464c-8d0d-2f323a58066b\" (UID: \"c3870c3e-94be-464c-8d0d-2f323a58066b\") " Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.969605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities" (OuterVolumeSpecName: "utilities") pod "c3870c3e-94be-464c-8d0d-2f323a58066b" (UID: "c3870c3e-94be-464c-8d0d-2f323a58066b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:42:28 crc kubenswrapper[4810]: I1003 07:42:28.974509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc" (OuterVolumeSpecName: "kube-api-access-dzplc") pod "c3870c3e-94be-464c-8d0d-2f323a58066b" (UID: "c3870c3e-94be-464c-8d0d-2f323a58066b"). InnerVolumeSpecName "kube-api-access-dzplc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.069735 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzplc\" (UniqueName: \"kubernetes.io/projected/c3870c3e-94be-464c-8d0d-2f323a58066b-kube-api-access-dzplc\") on node \"crc\" DevicePath \"\"" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.069768 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.214270 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3870c3e-94be-464c-8d0d-2f323a58066b" (UID: "c3870c3e-94be-464c-8d0d-2f323a58066b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.271662 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3870c3e-94be-464c-8d0d-2f323a58066b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.375382 4810 generic.go:334] "Generic (PLEG): container finished" podID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerID="0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26" exitCode=0 Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.375436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerDied","Data":"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26"} Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.375470 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wdxk9" event={"ID":"c3870c3e-94be-464c-8d0d-2f323a58066b","Type":"ContainerDied","Data":"55096c816842e2df3e7691595b118895cad13eddfd4a7accd9464ba57cd676c6"} Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.375475 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wdxk9" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.375493 4810 scope.go:117] "RemoveContainer" containerID="0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.402236 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.404882 4810 scope.go:117] "RemoveContainer" containerID="1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.409000 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wdxk9"] Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.428451 4810 scope.go:117] "RemoveContainer" containerID="c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.455278 4810 scope.go:117] "RemoveContainer" containerID="0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26" Oct 03 07:42:29 crc kubenswrapper[4810]: E1003 07:42:29.455904 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26\": container with ID starting with 0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26 not found: ID does not exist" containerID="0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.455952 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26"} err="failed to get container status \"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26\": rpc error: code = NotFound desc = could not find container \"0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26\": container with ID starting with 0a255d48f11b91a126f10c5b05f654f173666d3adce6b808c8a9b29591eb4f26 not found: ID does not exist" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.455979 4810 scope.go:117] "RemoveContainer" containerID="1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5" Oct 03 07:42:29 crc kubenswrapper[4810]: E1003 07:42:29.456651 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5\": container with ID starting with 1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5 not found: ID does not exist" containerID="1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.456688 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5"} err="failed to get container status \"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5\": rpc error: code = NotFound desc = could not find container \"1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5\": container with ID starting with 1d0323c828b3bdd6d2e94e1218ec33875a4133e50e6aace567064234c0d3e6b5 not found: ID does not exist" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.456709 4810 scope.go:117] "RemoveContainer" 
containerID="c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc" Oct 03 07:42:29 crc kubenswrapper[4810]: E1003 07:42:29.457257 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc\": container with ID starting with c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc not found: ID does not exist" containerID="c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc" Oct 03 07:42:29 crc kubenswrapper[4810]: I1003 07:42:29.457280 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc"} err="failed to get container status \"c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc\": rpc error: code = NotFound desc = could not find container \"c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc\": container with ID starting with c0156f415b8dc0a4ebc2d27670d1fcaf9c1cc7df88f27619df0be0c236ae0afc not found: ID does not exist" Oct 03 07:42:31 crc kubenswrapper[4810]: I1003 07:42:31.313910 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" path="/var/lib/kubelet/pods/c3870c3e-94be-464c-8d0d-2f323a58066b/volumes" Oct 03 07:43:02 crc kubenswrapper[4810]: I1003 07:43:02.089042 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:43:02 crc kubenswrapper[4810]: I1003 07:43:02.090030 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:43:32 crc kubenswrapper[4810]: I1003 07:43:32.089001 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:43:32 crc kubenswrapper[4810]: I1003 07:43:32.089967 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:44:02 crc kubenswrapper[4810]: I1003 07:44:02.088977 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:44:02 crc kubenswrapper[4810]: I1003 07:44:02.089518 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:44:02 crc kubenswrapper[4810]: I1003 07:44:02.089571 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:44:02 crc kubenswrapper[4810]: I1003 07:44:02.090188 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:44:02 crc kubenswrapper[4810]: I1003 07:44:02.090242 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" gracePeriod=600 Oct 03 07:44:02 crc kubenswrapper[4810]: E1003 07:44:02.221675 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:44:03 crc kubenswrapper[4810]: I1003 07:44:03.171461 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" exitCode=0 Oct 03 07:44:03 crc kubenswrapper[4810]: I1003 07:44:03.171552 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a"} Oct 03 07:44:03 crc kubenswrapper[4810]: I1003 07:44:03.171670 4810 scope.go:117] "RemoveContainer" containerID="99650ae53b6364d509744fbf92d20cef53ea6eeb981d5567f7a1662a041d1edf" Oct 03 07:44:03 crc kubenswrapper[4810]: I1003 07:44:03.172695 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:44:03 crc kubenswrapper[4810]: E1003 07:44:03.173243 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:44:17 crc kubenswrapper[4810]: I1003 07:44:17.318187 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:44:17 crc kubenswrapper[4810]: E1003 07:44:17.319775 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:44:30 crc kubenswrapper[4810]: I1003 07:44:30.302623 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:44:30 crc kubenswrapper[4810]: E1003 07:44:30.303285 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:44:44 crc kubenswrapper[4810]: I1003 07:44:44.303532 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:44:44 crc kubenswrapper[4810]: E1003 07:44:44.304580 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:44:57 crc kubenswrapper[4810]: I1003 07:44:57.307774 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:44:57 crc kubenswrapper[4810]: E1003 07:44:57.308683 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.185177 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt"] Oct 03 07:45:00 crc kubenswrapper[4810]: E1003 07:45:00.186252 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="registry-server" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.186277 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="registry-server" Oct 03 07:45:00 crc kubenswrapper[4810]: E1003 07:45:00.186313 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="extract-content" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.186325 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="extract-content" Oct 03 07:45:00 crc kubenswrapper[4810]: E1003 07:45:00.186407 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="extract-utilities" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.186423 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="extract-utilities" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.186671 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3870c3e-94be-464c-8d0d-2f323a58066b" containerName="registry-server" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.187451 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.189567 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.189925 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.193510 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt"] Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.370415 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwj7\" (UniqueName: \"kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.370489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.370551 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.472680 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.472803 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.472978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czwj7\" (UniqueName: \"kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7\") pod 
\"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.474035 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.491259 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.502150 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwj7\" (UniqueName: \"kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7\") pod \"collect-profiles-29324625-xc6zt\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.522499 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:00 crc kubenswrapper[4810]: I1003 07:45:00.973213 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt"] Oct 03 07:45:01 crc kubenswrapper[4810]: I1003 07:45:01.684362 4810 generic.go:334] "Generic (PLEG): container finished" podID="68543805-1fb5-4859-8c83-85b363604567" containerID="2401285a921ee290e4e7e9187c697863dedfaa15d14d71ca8f9b427918dbde77" exitCode=0 Oct 03 07:45:01 crc kubenswrapper[4810]: I1003 07:45:01.684410 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" event={"ID":"68543805-1fb5-4859-8c83-85b363604567","Type":"ContainerDied","Data":"2401285a921ee290e4e7e9187c697863dedfaa15d14d71ca8f9b427918dbde77"} Oct 03 07:45:01 crc kubenswrapper[4810]: I1003 07:45:01.684731 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" event={"ID":"68543805-1fb5-4859-8c83-85b363604567","Type":"ContainerStarted","Data":"c143f9d16ce975194cb2348d30bb174bc22eeadd9cffc0782c7067009c81b501"} Oct 03 07:45:02 crc kubenswrapper[4810]: I1003 07:45:02.987873 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.110037 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czwj7\" (UniqueName: \"kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7\") pod \"68543805-1fb5-4859-8c83-85b363604567\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.110196 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume\") pod \"68543805-1fb5-4859-8c83-85b363604567\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.110412 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume\") pod \"68543805-1fb5-4859-8c83-85b363604567\" (UID: \"68543805-1fb5-4859-8c83-85b363604567\") " Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.111332 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume" (OuterVolumeSpecName: "config-volume") pod "68543805-1fb5-4859-8c83-85b363604567" (UID: "68543805-1fb5-4859-8c83-85b363604567"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.116261 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68543805-1fb5-4859-8c83-85b363604567" (UID: "68543805-1fb5-4859-8c83-85b363604567"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.120133 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7" (OuterVolumeSpecName: "kube-api-access-czwj7") pod "68543805-1fb5-4859-8c83-85b363604567" (UID: "68543805-1fb5-4859-8c83-85b363604567"). InnerVolumeSpecName "kube-api-access-czwj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.212147 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68543805-1fb5-4859-8c83-85b363604567-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.212184 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czwj7\" (UniqueName: \"kubernetes.io/projected/68543805-1fb5-4859-8c83-85b363604567-kube-api-access-czwj7\") on node \"crc\" DevicePath \"\"" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.212203 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68543805-1fb5-4859-8c83-85b363604567-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.703401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" event={"ID":"68543805-1fb5-4859-8c83-85b363604567","Type":"ContainerDied","Data":"c143f9d16ce975194cb2348d30bb174bc22eeadd9cffc0782c7067009c81b501"} Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.703791 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c143f9d16ce975194cb2348d30bb174bc22eeadd9cffc0782c7067009c81b501" Oct 03 07:45:03 crc kubenswrapper[4810]: I1003 07:45:03.703438 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt" Oct 03 07:45:04 crc kubenswrapper[4810]: I1003 07:45:04.070421 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69"] Oct 03 07:45:04 crc kubenswrapper[4810]: I1003 07:45:04.075708 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324580-tnt69"] Oct 03 07:45:05 crc kubenswrapper[4810]: I1003 07:45:05.313333 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f286eee-b0b9-45c5-95d9-5ac9c44376ed" path="/var/lib/kubelet/pods/2f286eee-b0b9-45c5-95d9-5ac9c44376ed/volumes" Oct 03 07:45:12 crc kubenswrapper[4810]: I1003 07:45:12.302817 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:45:12 crc kubenswrapper[4810]: E1003 07:45:12.303649 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:45:14 crc kubenswrapper[4810]: I1003 07:45:14.012306 4810 scope.go:117] "RemoveContainer" containerID="fa5bbe7f3f0cba1cdc8bb399b734e2c841c6dcbc14c84d846c08c3d3d1ce32a3" Oct 03 07:45:23 crc kubenswrapper[4810]: I1003 07:45:23.302641 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:45:23 crc kubenswrapper[4810]: E1003 07:45:23.303663 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:45:36 crc kubenswrapper[4810]: I1003 07:45:36.302678 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:45:36 crc kubenswrapper[4810]: E1003 07:45:36.304291 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:45:51 crc kubenswrapper[4810]: I1003 07:45:51.303189 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:45:51 crc kubenswrapper[4810]: E1003 07:45:51.304240 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:46:03 crc kubenswrapper[4810]: I1003 07:46:03.303157 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:46:03 crc kubenswrapper[4810]: E1003 07:46:03.303998 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:46:14 crc kubenswrapper[4810]: I1003 07:46:14.304060 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:46:14 crc kubenswrapper[4810]: E1003 07:46:14.305037 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:46:26 crc kubenswrapper[4810]: I1003 07:46:26.302689 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:46:26 crc kubenswrapper[4810]: E1003 07:46:26.304138 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:46:39 crc kubenswrapper[4810]: I1003 07:46:39.303034 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:46:39 crc kubenswrapper[4810]: E1003 07:46:39.303959 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:46:54 crc kubenswrapper[4810]: I1003 07:46:54.302760 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:46:54 crc kubenswrapper[4810]: E1003 07:46:54.303737 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:47:05 crc kubenswrapper[4810]: I1003 07:47:05.303576 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:47:05 crc kubenswrapper[4810]: E1003 07:47:05.305582 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:47:18 crc kubenswrapper[4810]: I1003 07:47:18.303167 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:47:18 crc kubenswrapper[4810]: E1003 07:47:18.306182 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:47:32 crc kubenswrapper[4810]: I1003 07:47:32.302981 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:47:32 crc kubenswrapper[4810]: E1003 07:47:32.304025 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:47:43 crc kubenswrapper[4810]: I1003 07:47:43.303813 4810 
scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:47:43 crc kubenswrapper[4810]: E1003 07:47:43.305360 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:47:57 crc kubenswrapper[4810]: I1003 07:47:57.322275 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:47:57 crc kubenswrapper[4810]: E1003 07:47:57.323167 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:48:11 crc kubenswrapper[4810]: I1003 07:48:11.303821 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:48:11 crc kubenswrapper[4810]: E1003 07:48:11.305635 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:48:24 crc kubenswrapper[4810]: I1003 07:48:24.302700 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:48:24 crc kubenswrapper[4810]: E1003 07:48:24.303670 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:48:35 crc kubenswrapper[4810]: I1003 07:48:35.302282 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:48:35 crc kubenswrapper[4810]: E1003 07:48:35.303030 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.197392 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:46 crc kubenswrapper[4810]: E1003 07:48:46.198337 4810 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="68543805-1fb5-4859-8c83-85b363604567" containerName="collect-profiles" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.198355 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68543805-1fb5-4859-8c83-85b363604567" containerName="collect-profiles" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.198554 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68543805-1fb5-4859-8c83-85b363604567" containerName="collect-profiles" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.199743 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.206725 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.332224 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2r6\" (UniqueName: \"kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.332356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.332405 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.433428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.433548 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2r6\" (UniqueName: \"kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.433604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.434168 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.434457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.455989 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2r6\" (UniqueName: \"kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6\") pod \"redhat-marketplace-pw29f\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.522688 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:46 crc kubenswrapper[4810]: I1003 07:48:46.947353 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:47 crc kubenswrapper[4810]: E1003 07:48:47.239467 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c8008c_ece9_4ded_8079_dc261d0b5d50.slice/crio-conmon-51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c8008c_ece9_4ded_8079_dc261d0b5d50.slice/crio-51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997.scope\": RecentStats: unable to find data in memory cache]" Oct 03 07:48:47 crc kubenswrapper[4810]: I1003 07:48:47.307508 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:48:47 crc kubenswrapper[4810]: E1003 07:48:47.307712 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:48:47 crc kubenswrapper[4810]: I1003 07:48:47.721044 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerID="51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997" exitCode=0 Oct 03 07:48:47 crc kubenswrapper[4810]: I1003 07:48:47.721112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerDied","Data":"51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997"} Oct 03 07:48:47 crc kubenswrapper[4810]: I1003 07:48:47.721171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" 
event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerStarted","Data":"6c3cb918850958e42cb5f9167bed69dd9b5bbdf81318333c64f4668170d0afe2"} Oct 03 07:48:47 crc kubenswrapper[4810]: I1003 07:48:47.723578 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:48:49 crc kubenswrapper[4810]: I1003 07:48:49.740943 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerID="2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321" exitCode=0 Oct 03 07:48:49 crc kubenswrapper[4810]: I1003 07:48:49.741043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerDied","Data":"2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321"} Oct 03 07:48:50 crc kubenswrapper[4810]: I1003 07:48:50.753907 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerStarted","Data":"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a"} Oct 03 07:48:50 crc kubenswrapper[4810]: I1003 07:48:50.775343 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pw29f" podStartSLOduration=2.268768014 podStartE2EDuration="4.775300529s" podCreationTimestamp="2025-10-03 07:48:46 +0000 UTC" firstStartedPulling="2025-10-03 07:48:47.723260241 +0000 UTC m=+3161.150510986" lastFinishedPulling="2025-10-03 07:48:50.229792726 +0000 UTC m=+3163.657043501" observedRunningTime="2025-10-03 07:48:50.773819419 +0000 UTC m=+3164.201070154" watchObservedRunningTime="2025-10-03 07:48:50.775300529 +0000 UTC m=+3164.202551274" Oct 03 07:48:56 crc kubenswrapper[4810]: I1003 07:48:56.524530 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:56 crc kubenswrapper[4810]: I1003 07:48:56.525099 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:56 crc kubenswrapper[4810]: I1003 07:48:56.623441 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:56 crc kubenswrapper[4810]: I1003 07:48:56.858766 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:56 crc kubenswrapper[4810]: I1003 07:48:56.917599 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:58 crc kubenswrapper[4810]: I1003 07:48:58.818526 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pw29f" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="registry-server" containerID="cri-o://fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a" gracePeriod=2 Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.256135 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.346982 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2r6\" (UniqueName: \"kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6\") pod \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.347094 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content\") pod \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.347216 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities\") pod \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\" (UID: \"d5c8008c-ece9-4ded-8079-dc261d0b5d50\") " Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.348631 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities" (OuterVolumeSpecName: "utilities") pod "d5c8008c-ece9-4ded-8079-dc261d0b5d50" (UID: "d5c8008c-ece9-4ded-8079-dc261d0b5d50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.355405 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6" (OuterVolumeSpecName: "kube-api-access-zq2r6") pod "d5c8008c-ece9-4ded-8079-dc261d0b5d50" (UID: "d5c8008c-ece9-4ded-8079-dc261d0b5d50"). InnerVolumeSpecName "kube-api-access-zq2r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.371127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5c8008c-ece9-4ded-8079-dc261d0b5d50" (UID: "d5c8008c-ece9-4ded-8079-dc261d0b5d50"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.449196 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.449250 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5c8008c-ece9-4ded-8079-dc261d0b5d50-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.449266 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2r6\" (UniqueName: \"kubernetes.io/projected/d5c8008c-ece9-4ded-8079-dc261d0b5d50-kube-api-access-zq2r6\") on node \"crc\" DevicePath \"\"" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.830098 4810 generic.go:334] "Generic (PLEG): container finished" podID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerID="fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a" exitCode=0 Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.830148 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerDied","Data":"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a"} Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.830179 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pw29f" event={"ID":"d5c8008c-ece9-4ded-8079-dc261d0b5d50","Type":"ContainerDied","Data":"6c3cb918850958e42cb5f9167bed69dd9b5bbdf81318333c64f4668170d0afe2"} Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.830202 4810 scope.go:117] "RemoveContainer" containerID="fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.830330 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pw29f" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.857605 4810 scope.go:117] "RemoveContainer" containerID="2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.878463 4810 scope.go:117] "RemoveContainer" containerID="51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.892703 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.899241 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pw29f"] Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.906997 4810 scope.go:117] "RemoveContainer" containerID="fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a" Oct 03 07:48:59 crc kubenswrapper[4810]: E1003 07:48:59.907477 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a\": container with ID starting with fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a not found: ID does not exist" containerID="fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.907525 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a"} err="failed to get container status \"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a\": rpc error: code = NotFound desc = could not find container \"fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a\": container with ID starting with fd112d18c088f893816be7da1fca09381d73d699c4e5e540826c1451a2f6020a not found: ID does not exist" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.907553 4810 scope.go:117] "RemoveContainer" containerID="2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321" Oct 03 07:48:59 crc kubenswrapper[4810]: E1003 07:48:59.907952 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321\": container with ID starting with 2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321 not found: ID does not exist" containerID="2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.908010 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321"} err="failed to get container status \"2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321\": rpc error: code = NotFound desc = could not find container \"2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321\": container with ID starting with 2b02152b4852f98a4ee2d56e4258dfb061cd5f91b6be47a0fcc4c32c7d0f4321 not found: ID does not exist" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.908045 4810 scope.go:117] "RemoveContainer" containerID="51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997" Oct 03 07:48:59 crc kubenswrapper[4810]: E1003 07:48:59.908395 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997\": container with ID starting with 51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997 not found: ID does not exist" containerID="51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997" Oct 03 07:48:59 crc kubenswrapper[4810]: I1003 07:48:59.908425 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997"} err="failed to get container status \"51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997\": rpc error: code = NotFound desc = could not find container \"51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997\": container with ID starting with 51f72b7415b8c71a9587f6f29e633e21ec92b1059f85b01703bbc3805b7d6997 not found: ID does not exist" Oct 03 07:49:00 crc kubenswrapper[4810]: I1003 07:49:00.302557 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:49:00 crc kubenswrapper[4810]: E1003 07:49:00.302835 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:49:01 crc kubenswrapper[4810]: I1003 07:49:01.311813 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" path="/var/lib/kubelet/pods/d5c8008c-ece9-4ded-8079-dc261d0b5d50/volumes" Oct 03 07:49:11 crc kubenswrapper[4810]: I1003 07:49:11.302478 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:49:11 crc kubenswrapper[4810]: I1003 07:49:11.941499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899"} Oct 03 07:51:32 crc kubenswrapper[4810]: I1003 07:51:32.088371 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:51:32 crc kubenswrapper[4810]: I1003 07:51:32.088882 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:52:02 crc kubenswrapper[4810]: I1003 07:52:02.089206 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:52:02 crc kubenswrapper[4810]: I1003 07:52:02.089935 4810 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.474617 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:28 crc kubenswrapper[4810]: E1003 07:52:28.475521 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="registry-server" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.475534 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="registry-server" Oct 03 07:52:28 crc kubenswrapper[4810]: E1003 07:52:28.475554 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="extract-content" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.475560 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="extract-content" Oct 03 07:52:28 crc kubenswrapper[4810]: E1003 07:52:28.475579 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="extract-utilities" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.475585 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="extract-utilities" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.475724 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c8008c-ece9-4ded-8079-dc261d0b5d50" containerName="registry-server" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.476789 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.485750 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.569460 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqvsf\" (UniqueName: \"kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.569746 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.569795 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.671141 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqvsf\" (UniqueName: \"kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.671203 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.671250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.671730 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.671867 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.691844 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wqvsf\" (UniqueName: \"kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf\") pod \"community-operators-2zmfp\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:28 crc kubenswrapper[4810]: I1003 07:52:28.798237 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:29 crc kubenswrapper[4810]: I1003 07:52:29.312451 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:29 crc kubenswrapper[4810]: I1003 07:52:29.588425 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerStarted","Data":"c6a0d8bbef282765c5df5acdb7bad418513c56e94539172d1b803f10a5723db8"} Oct 03 07:52:30 crc kubenswrapper[4810]: I1003 07:52:30.598502 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c506dad-34e3-45b7-b866-016b32d433f1" containerID="97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4" exitCode=0 Oct 03 07:52:30 crc kubenswrapper[4810]: I1003 07:52:30.598573 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerDied","Data":"97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4"} Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.088631 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.089323 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.089457 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.091106 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.091312 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899" gracePeriod=600 Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.616757 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" 
containerID="1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899" exitCode=0 Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.616806 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899"} Oct 03 07:52:32 crc kubenswrapper[4810]: I1003 07:52:32.616842 4810 scope.go:117] "RemoveContainer" containerID="02097ede51fdcd99c999fb88bc5c01beea0ebd875821b372494647c83d8bcf7a" Oct 03 07:52:34 crc kubenswrapper[4810]: I1003 07:52:34.652427 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f"} Oct 03 07:52:34 crc kubenswrapper[4810]: I1003 07:52:34.657371 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c506dad-34e3-45b7-b866-016b32d433f1" containerID="b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c" exitCode=0 Oct 03 07:52:34 crc kubenswrapper[4810]: I1003 07:52:34.657426 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerDied","Data":"b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c"} Oct 03 07:52:37 crc kubenswrapper[4810]: I1003 07:52:37.682466 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerStarted","Data":"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60"} Oct 03 07:52:37 crc kubenswrapper[4810]: I1003 07:52:37.710583 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2zmfp" podStartSLOduration=3.470261302 podStartE2EDuration="9.710562789s" podCreationTimestamp="2025-10-03 07:52:28 +0000 UTC" firstStartedPulling="2025-10-03 07:52:30.600801931 +0000 UTC m=+3384.028052676" lastFinishedPulling="2025-10-03 07:52:36.841103428 +0000 UTC m=+3390.268354163" observedRunningTime="2025-10-03 07:52:37.706024358 +0000 UTC m=+3391.133275093" watchObservedRunningTime="2025-10-03 07:52:37.710562789 +0000 UTC m=+3391.137813524" Oct 03 07:52:38 crc kubenswrapper[4810]: I1003 07:52:38.799388 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:38 crc kubenswrapper[4810]: I1003 07:52:38.799443 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:39 crc kubenswrapper[4810]: I1003 07:52:39.847420 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2zmfp" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="registry-server" probeResult="failure" output=< Oct 03 07:52:39 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 07:52:39 crc kubenswrapper[4810]: > Oct 03 07:52:48 crc kubenswrapper[4810]: I1003 07:52:48.873548 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:48 crc kubenswrapper[4810]: I1003 07:52:48.936949 4810 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:49 crc kubenswrapper[4810]: I1003 07:52:49.124837 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:50 crc kubenswrapper[4810]: I1003 07:52:50.796821 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2zmfp" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="registry-server" containerID="cri-o://696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60" gracePeriod=2 Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.795945 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.806064 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zmfp" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.806084 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerDied","Data":"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60"} Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.806126 4810 scope.go:117] "RemoveContainer" containerID="696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.808164 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c506dad-34e3-45b7-b866-016b32d433f1" containerID="696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60" exitCode=0 Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.808303 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zmfp" event={"ID":"0c506dad-34e3-45b7-b866-016b32d433f1","Type":"ContainerDied","Data":"c6a0d8bbef282765c5df5acdb7bad418513c56e94539172d1b803f10a5723db8"} Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.834384 4810 scope.go:117] "RemoveContainer" containerID="b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.850070 4810 scope.go:117] "RemoveContainer" containerID="97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.873100 4810 scope.go:117] "RemoveContainer" containerID="696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60" Oct 03 07:52:51 crc kubenswrapper[4810]: E1003 07:52:51.873537 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60\": container with ID starting with 696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60 not found: ID does not exist" containerID="696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.873577 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60"} err="failed to get container status \"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60\": rpc error: code = NotFound desc = could not find container 
\"696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60\": container with ID starting with 696954455c3f66332429b548b586783801915ce4b1687969cb92d53c7e793a60 not found: ID does not exist" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.873598 4810 scope.go:117] "RemoveContainer" containerID="b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c" Oct 03 07:52:51 crc kubenswrapper[4810]: E1003 07:52:51.873855 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c\": container with ID starting with b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c not found: ID does not exist" containerID="b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.873877 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c"} err="failed to get container status \"b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c\": rpc error: code = NotFound desc = could not find container \"b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c\": container with ID starting with b1d5a231038a56d6577e24ed599c05a1fdc31ef73ed14f640f56fb778bcd941c not found: ID does not exist" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.873908 4810 scope.go:117] "RemoveContainer" containerID="97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4" Oct 03 07:52:51 crc kubenswrapper[4810]: E1003 07:52:51.874125 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4\": container with ID starting with 97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4 not found: ID does not exist" containerID="97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.874147 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4"} err="failed to get container status \"97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4\": rpc error: code = NotFound desc = could not find container \"97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4\": container with ID starting with 97cce617a5f1787b69bd60f4dc83803069f543acfeef8855cab05c5ac0c930e4 not found: ID does not exist" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.959022 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities\") pod \"0c506dad-34e3-45b7-b866-016b32d433f1\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.959515 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqvsf\" (UniqueName: \"kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf\") pod \"0c506dad-34e3-45b7-b866-016b32d433f1\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.959815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content\") pod \"0c506dad-34e3-45b7-b866-016b32d433f1\" (UID: \"0c506dad-34e3-45b7-b866-016b32d433f1\") " Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.960380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities" (OuterVolumeSpecName: "utilities") pod "0c506dad-34e3-45b7-b866-016b32d433f1" (UID: "0c506dad-34e3-45b7-b866-016b32d433f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.961704 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:51 crc kubenswrapper[4810]: I1003 07:52:51.965278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf" (OuterVolumeSpecName: "kube-api-access-wqvsf") pod "0c506dad-34e3-45b7-b866-016b32d433f1" (UID: "0c506dad-34e3-45b7-b866-016b32d433f1"). InnerVolumeSpecName "kube-api-access-wqvsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:52:52 crc kubenswrapper[4810]: I1003 07:52:52.020945 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c506dad-34e3-45b7-b866-016b32d433f1" (UID: "0c506dad-34e3-45b7-b866-016b32d433f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:52:52 crc kubenswrapper[4810]: I1003 07:52:52.062755 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c506dad-34e3-45b7-b866-016b32d433f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:52 crc kubenswrapper[4810]: I1003 07:52:52.062807 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqvsf\" (UniqueName: \"kubernetes.io/projected/0c506dad-34e3-45b7-b866-016b32d433f1-kube-api-access-wqvsf\") on node \"crc\" DevicePath \"\"" Oct 03 07:52:52 crc kubenswrapper[4810]: I1003 07:52:52.141295 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:52 crc kubenswrapper[4810]: I1003 07:52:52.147022 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2zmfp"] Oct 03 07:52:53 crc kubenswrapper[4810]: I1003 07:52:53.317544 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" path="/var/lib/kubelet/pods/0c506dad-34e3-45b7-b866-016b32d433f1/volumes" Oct 03 07:55:02 crc kubenswrapper[4810]: I1003 07:55:02.089241 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:55:02 crc kubenswrapper[4810]: I1003 07:55:02.089840 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:55:32 crc kubenswrapper[4810]: I1003 07:55:32.088740 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:55:32 crc kubenswrapper[4810]: I1003 07:55:32.089357 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.088784 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.090153 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.090220 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.091055 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.091113 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" gracePeriod=600 Oct 03 07:56:02 crc kubenswrapper[4810]: E1003 07:56:02.224674 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.311108 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" exitCode=0 Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.311183 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f"} Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.311530 4810 scope.go:117] "RemoveContainer" containerID="1d8ab2e37783d2930c29eaa833f88ddbe66a06c0167808adc2eb5aa7317db899" Oct 03 07:56:02 crc kubenswrapper[4810]: I1003 07:56:02.311878 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:56:02 crc kubenswrapper[4810]: E1003 07:56:02.312151 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:17 crc kubenswrapper[4810]: I1003 07:56:17.308136 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:56:17 crc kubenswrapper[4810]: E1003 07:56:17.309129 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:31 crc kubenswrapper[4810]: I1003 07:56:31.302955 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:56:31 crc kubenswrapper[4810]: E1003 07:56:31.303808 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:43 crc kubenswrapper[4810]: I1003 07:56:43.304839 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:56:43 crc kubenswrapper[4810]: E1003 07:56:43.306584 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.010464 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:56:52 crc kubenswrapper[4810]: E1003 07:56:52.011534 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="extract-content" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.011558 4810 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="extract-content" Oct 03 07:56:52 crc kubenswrapper[4810]: E1003 07:56:52.011579 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="extract-utilities" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.011586 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="extract-utilities" Oct 03 07:56:52 crc kubenswrapper[4810]: E1003 07:56:52.011616 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="registry-server" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.011623 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="registry-server" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.011806 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c506dad-34e3-45b7-b866-016b32d433f1" containerName="registry-server" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.013648 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.034534 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.116093 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.116175 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pnm5\" (UniqueName: \"kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.116202 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.217914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.217981 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pnm5\" (UniqueName: \"kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.218002 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.218386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.218408 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.239826 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pnm5\" (UniqueName: \"kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5\") pod \"redhat-operators-j2rvj\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.340917 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:56:52 crc kubenswrapper[4810]: I1003 07:56:52.771179 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.730914 4810 generic.go:334] "Generic (PLEG): container finished" podID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerID="abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72" exitCode=0 Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.731220 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerDied","Data":"abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72"} Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.731260 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerStarted","Data":"3e9dc50a6e2656137900ddf15395cc32acada3e46bdf4aa39d8454d2c713785c"} Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.733303 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.809405 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4t2nc"] Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.811450 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.830719 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t2nc"] Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.942080 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-utilities\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.942136 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-catalog-content\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:53 crc kubenswrapper[4810]: I1003 07:56:53.942231 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54cbr\" (UniqueName: \"kubernetes.io/projected/6da6ea85-e420-40f6-829c-5983a746b478-kube-api-access-54cbr\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.043226 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54cbr\" (UniqueName: \"kubernetes.io/projected/6da6ea85-e420-40f6-829c-5983a746b478-kube-api-access-54cbr\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.043354 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-utilities\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.043394 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-catalog-content\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.043858 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-catalog-content\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.044078 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da6ea85-e420-40f6-829c-5983a746b478-utilities\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.070084 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-54cbr\" (UniqueName: \"kubernetes.io/projected/6da6ea85-e420-40f6-829c-5983a746b478-kube-api-access-54cbr\") pod \"certified-operators-4t2nc\" (UID: \"6da6ea85-e420-40f6-829c-5983a746b478\") " pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.154382 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.676637 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t2nc"] Oct 03 07:56:54 crc kubenswrapper[4810]: W1003 07:56:54.686535 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da6ea85_e420_40f6_829c_5983a746b478.slice/crio-9b59af24e2ed75f1186149472c94590f2fab715bf495a73b86a3591ee39a5510 WatchSource:0}: Error finding container 9b59af24e2ed75f1186149472c94590f2fab715bf495a73b86a3591ee39a5510: Status 404 returned error can't find the container with id 9b59af24e2ed75f1186149472c94590f2fab715bf495a73b86a3591ee39a5510 Oct 03 07:56:54 crc kubenswrapper[4810]: I1003 07:56:54.743293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t2nc" event={"ID":"6da6ea85-e420-40f6-829c-5983a746b478","Type":"ContainerStarted","Data":"9b59af24e2ed75f1186149472c94590f2fab715bf495a73b86a3591ee39a5510"} Oct 03 07:56:55 crc kubenswrapper[4810]: I1003 07:56:55.752405 4810 generic.go:334] "Generic (PLEG): container finished" podID="6da6ea85-e420-40f6-829c-5983a746b478" containerID="0a5b4c0644fc43c42d2ee127dd2167f653f8aec686b2c832e0006af366cc9d35" exitCode=0 Oct 03 07:56:55 crc kubenswrapper[4810]: I1003 07:56:55.752455 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t2nc" event={"ID":"6da6ea85-e420-40f6-829c-5983a746b478","Type":"ContainerDied","Data":"0a5b4c0644fc43c42d2ee127dd2167f653f8aec686b2c832e0006af366cc9d35"} Oct 03 07:56:57 crc kubenswrapper[4810]: I1003 07:56:57.309225 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:56:57 crc kubenswrapper[4810]: E1003 07:56:57.309971 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:56:57 crc kubenswrapper[4810]: I1003 07:56:57.771993 4810 generic.go:334] "Generic (PLEG): container finished" podID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerID="262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1" exitCode=0 Oct 03 07:56:57 crc kubenswrapper[4810]: I1003 07:56:57.772046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerDied","Data":"262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1"} Oct 03 07:57:00 crc kubenswrapper[4810]: I1003 07:57:00.796542 4810 generic.go:334] "Generic (PLEG): container finished" podID="6da6ea85-e420-40f6-829c-5983a746b478" 
containerID="910dc87be9c98d3194dfcb63e2d284d20f70bcc88c4152169327b6510cf7e93a" exitCode=0 Oct 03 07:57:00 crc kubenswrapper[4810]: I1003 07:57:00.796666 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t2nc" event={"ID":"6da6ea85-e420-40f6-829c-5983a746b478","Type":"ContainerDied","Data":"910dc87be9c98d3194dfcb63e2d284d20f70bcc88c4152169327b6510cf7e93a"} Oct 03 07:57:00 crc kubenswrapper[4810]: I1003 07:57:00.802397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerStarted","Data":"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d"} Oct 03 07:57:00 crc kubenswrapper[4810]: I1003 07:57:00.839745 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j2rvj" podStartSLOduration=3.888083217 podStartE2EDuration="9.839722304s" podCreationTimestamp="2025-10-03 07:56:51 +0000 UTC" firstStartedPulling="2025-10-03 07:56:53.73273181 +0000 UTC m=+3647.159982585" lastFinishedPulling="2025-10-03 07:56:59.684370907 +0000 UTC m=+3653.111621672" observedRunningTime="2025-10-03 07:57:00.836479818 +0000 UTC m=+3654.263730573" watchObservedRunningTime="2025-10-03 07:57:00.839722304 +0000 UTC m=+3654.266973039" Oct 03 07:57:02 crc kubenswrapper[4810]: I1003 07:57:02.341955 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:02 crc kubenswrapper[4810]: I1003 07:57:02.343069 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:02 crc kubenswrapper[4810]: I1003 07:57:02.820948 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4t2nc" event={"ID":"6da6ea85-e420-40f6-829c-5983a746b478","Type":"ContainerStarted","Data":"def772fd14543962cbed67c0e5516a974f002f7dcd149b3fb0773829e3be4a72"} Oct 03 07:57:02 crc kubenswrapper[4810]: I1003 07:57:02.841974 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4t2nc" podStartSLOduration=3.958906354 podStartE2EDuration="9.841956829s" podCreationTimestamp="2025-10-03 07:56:53 +0000 UTC" firstStartedPulling="2025-10-03 07:56:55.754107359 +0000 UTC m=+3649.181358104" lastFinishedPulling="2025-10-03 07:57:01.637157844 +0000 UTC m=+3655.064408579" observedRunningTime="2025-10-03 07:57:02.837302954 +0000 UTC m=+3656.264553689" watchObservedRunningTime="2025-10-03 07:57:02.841956829 +0000 UTC m=+3656.269207564" Oct 03 07:57:03 crc kubenswrapper[4810]: I1003 07:57:03.382980 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j2rvj" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="registry-server" probeResult="failure" output=< Oct 03 07:57:03 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 07:57:03 crc kubenswrapper[4810]: > Oct 03 07:57:04 crc kubenswrapper[4810]: I1003 07:57:04.155370 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:57:04 crc kubenswrapper[4810]: I1003 07:57:04.155415 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:57:04 crc kubenswrapper[4810]: I1003 07:57:04.195334 
4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:57:12 crc kubenswrapper[4810]: I1003 07:57:12.303800 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:57:12 crc kubenswrapper[4810]: E1003 07:57:12.304961 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:57:12 crc kubenswrapper[4810]: I1003 07:57:12.411665 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:12 crc kubenswrapper[4810]: I1003 07:57:12.459014 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:12 crc kubenswrapper[4810]: I1003 07:57:12.655027 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:57:13 crc kubenswrapper[4810]: I1003 07:57:13.923478 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j2rvj" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="registry-server" containerID="cri-o://8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d" gracePeriod=2 Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.209351 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4t2nc" Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.885647 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4t2nc"] Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.931401 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.932307 4810 generic.go:334] "Generic (PLEG): container finished" podID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerID="8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d" exitCode=0 Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.932956 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerDied","Data":"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d"} Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.933032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j2rvj" event={"ID":"9d0cb663-2a74-49e6-80ad-e6ee06aa7095","Type":"ContainerDied","Data":"3e9dc50a6e2656137900ddf15395cc32acada3e46bdf4aa39d8454d2c713785c"} Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.933051 4810 scope.go:117] "RemoveContainer" containerID="8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d" Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.960718 4810 scope.go:117] "RemoveContainer" containerID="262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1" Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.983045 4810 scope.go:117] "RemoveContainer" containerID="abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72" Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.997852 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pnm5\" (UniqueName: \"kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5\") pod \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.997997 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content\") pod \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.998120 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities\") pod \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\" (UID: \"9d0cb663-2a74-49e6-80ad-e6ee06aa7095\") " Oct 03 07:57:14 crc kubenswrapper[4810]: I1003 07:57:14.999580 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities" (OuterVolumeSpecName: "utilities") pod "9d0cb663-2a74-49e6-80ad-e6ee06aa7095" (UID: "9d0cb663-2a74-49e6-80ad-e6ee06aa7095"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.004803 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5" (OuterVolumeSpecName: "kube-api-access-8pnm5") pod "9d0cb663-2a74-49e6-80ad-e6ee06aa7095" (UID: "9d0cb663-2a74-49e6-80ad-e6ee06aa7095"). InnerVolumeSpecName "kube-api-access-8pnm5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.007208 4810 scope.go:117] "RemoveContainer" containerID="8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d" Oct 03 07:57:15 crc kubenswrapper[4810]: E1003 07:57:15.007818 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d\": container with ID starting with 8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d not found: ID does not exist" containerID="8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.007874 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d"} err="failed to get container status \"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d\": rpc error: code = NotFound desc = could not find container \"8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d\": container with ID starting with 8e371315153043bddf72d0a8953e43cfd6bfce01edd5eaf566703e88c67dea8d not found: ID does not exist" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.008012 4810 scope.go:117] "RemoveContainer" containerID="262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1" Oct 03 07:57:15 crc kubenswrapper[4810]: E1003 07:57:15.008561 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1\": container with ID starting with 262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1 not found: ID does not exist" containerID="262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.008610 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1"} err="failed to get container status \"262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1\": rpc error: code = NotFound desc = could not find container \"262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1\": container with ID starting with 262ddffc8d06fd8183e72fc00efae7aaf17315a4331c0177431fb43193e170d1 not found: ID does not exist" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.008643 4810 scope.go:117] "RemoveContainer" containerID="abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72" Oct 03 07:57:15 crc kubenswrapper[4810]: E1003 07:57:15.009344 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72\": container with ID starting with abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72 not found: ID does not exist" containerID="abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.009395 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72"} err="failed to get container status \"abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72\": rpc error: code = NotFound desc = could not 
find container \"abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72\": container with ID starting with abbaf47b1b19e8fcb2664ed84495e8295c7d6cecf43db6ed2afb27d53d4b8b72 not found: ID does not exist" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.060590 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.061045 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmvkz" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="registry-server" containerID="cri-o://97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792" gracePeriod=2 Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.099038 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pnm5\" (UniqueName: \"kubernetes.io/projected/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-kube-api-access-8pnm5\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.099073 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.114573 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d0cb663-2a74-49e6-80ad-e6ee06aa7095" (UID: "9d0cb663-2a74-49e6-80ad-e6ee06aa7095"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.202032 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d0cb663-2a74-49e6-80ad-e6ee06aa7095-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.469599 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.505209 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities\") pod \"0a41a867-b179-4665-b79f-de759a045f82\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.505477 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content\") pod \"0a41a867-b179-4665-b79f-de759a045f82\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.505617 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpv9s\" (UniqueName: \"kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s\") pod \"0a41a867-b179-4665-b79f-de759a045f82\" (UID: \"0a41a867-b179-4665-b79f-de759a045f82\") " Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.505785 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities" (OuterVolumeSpecName: "utilities") pod "0a41a867-b179-4665-b79f-de759a045f82" (UID: "0a41a867-b179-4665-b79f-de759a045f82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.506683 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.517817 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s" (OuterVolumeSpecName: "kube-api-access-vpv9s") pod "0a41a867-b179-4665-b79f-de759a045f82" (UID: "0a41a867-b179-4665-b79f-de759a045f82"). InnerVolumeSpecName "kube-api-access-vpv9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.558461 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a41a867-b179-4665-b79f-de759a045f82" (UID: "0a41a867-b179-4665-b79f-de759a045f82"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.609300 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a41a867-b179-4665-b79f-de759a045f82-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.609368 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpv9s\" (UniqueName: \"kubernetes.io/projected/0a41a867-b179-4665-b79f-de759a045f82-kube-api-access-vpv9s\") on node \"crc\" DevicePath \"\"" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.942944 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j2rvj" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.947769 4810 generic.go:334] "Generic (PLEG): container finished" podID="0a41a867-b179-4665-b79f-de759a045f82" containerID="97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792" exitCode=0 Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.947813 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerDied","Data":"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792"} Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.947842 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmvkz" event={"ID":"0a41a867-b179-4665-b79f-de759a045f82","Type":"ContainerDied","Data":"d45ccaaee75f8d623d012a748930b68d0f48aa06813830245229fa8591620395"} Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.947864 4810 scope.go:117] "RemoveContainer" containerID="97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.947882 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmvkz" Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.971796 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.978677 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j2rvj"] Oct 03 07:57:15 crc kubenswrapper[4810]: I1003 07:57:15.982140 4810 scope.go:117] "RemoveContainer" containerID="2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.002687 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.006406 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmvkz"] Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.045114 4810 scope.go:117] "RemoveContainer" containerID="92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.077771 4810 scope.go:117] "RemoveContainer" containerID="97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792" Oct 03 07:57:16 crc kubenswrapper[4810]: E1003 07:57:16.078462 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792\": container with ID starting with 97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792 not found: ID does not exist" containerID="97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.078505 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792"} err="failed to get container status \"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792\": rpc error: code = NotFound desc = could not find container \"97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792\": container with ID starting with 
97e8030646d8512110be6c03a5607115758084a5430fff3681540ac5fa516792 not found: ID does not exist" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.078525 4810 scope.go:117] "RemoveContainer" containerID="2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781" Oct 03 07:57:16 crc kubenswrapper[4810]: E1003 07:57:16.078816 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781\": container with ID starting with 2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781 not found: ID does not exist" containerID="2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.078856 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781"} err="failed to get container status \"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781\": rpc error: code = NotFound desc = could not find container \"2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781\": container with ID starting with 2dfa0657f2e3ea575c7436dc4b65857f9c5d935f44a71bb29212d52b312a3781 not found: ID does not exist" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.078875 4810 scope.go:117] "RemoveContainer" containerID="92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229" Oct 03 07:57:16 crc kubenswrapper[4810]: E1003 07:57:16.079381 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229\": container with ID starting with 92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229 not found: ID does not exist" containerID="92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229" Oct 03 07:57:16 crc kubenswrapper[4810]: I1003 07:57:16.079455 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229"} err="failed to get container status \"92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229\": rpc error: code = NotFound desc = could not find container \"92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229\": container with ID starting with 92b2617ae9a43a8355293d3ea67e74ea2d8647f0e8cefcf17b3213907d650229 not found: ID does not exist" Oct 03 07:57:17 crc kubenswrapper[4810]: I1003 07:57:17.312399 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a41a867-b179-4665-b79f-de759a045f82" path="/var/lib/kubelet/pods/0a41a867-b179-4665-b79f-de759a045f82/volumes" Oct 03 07:57:17 crc kubenswrapper[4810]: I1003 07:57:17.313715 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" path="/var/lib/kubelet/pods/9d0cb663-2a74-49e6-80ad-e6ee06aa7095/volumes" Oct 03 07:57:27 crc kubenswrapper[4810]: I1003 07:57:27.310672 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:57:27 crc kubenswrapper[4810]: E1003 07:57:27.311670 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:57:39 crc kubenswrapper[4810]: I1003 07:57:39.302087 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:57:39 crc kubenswrapper[4810]: E1003 07:57:39.302968 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:57:52 crc kubenswrapper[4810]: I1003 07:57:52.302357 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:57:52 crc kubenswrapper[4810]: E1003 07:57:52.303411 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:58:03 crc kubenswrapper[4810]: I1003 07:58:03.303407 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:58:03 crc kubenswrapper[4810]: E1003 07:58:03.304492 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:58:18 crc kubenswrapper[4810]: I1003 07:58:18.303161 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:58:18 crc kubenswrapper[4810]: E1003 07:58:18.304409 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:58:33 crc kubenswrapper[4810]: I1003 07:58:33.302829 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:58:33 crc kubenswrapper[4810]: E1003 07:58:33.303927 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:58:45 crc kubenswrapper[4810]: I1003 07:58:45.303829 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:58:45 crc kubenswrapper[4810]: E1003 07:58:45.306219 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:00 crc kubenswrapper[4810]: I1003 07:59:00.303533 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:59:00 crc kubenswrapper[4810]: E1003 07:59:00.304801 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:11 crc kubenswrapper[4810]: I1003 07:59:11.302622 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:59:11 crc kubenswrapper[4810]: E1003 07:59:11.303584 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:26 crc kubenswrapper[4810]: I1003 07:59:26.302254 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:59:26 crc kubenswrapper[4810]: E1003 07:59:26.303235 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:37 crc kubenswrapper[4810]: I1003 07:59:37.306046 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:59:37 crc kubenswrapper[4810]: E1003 07:59:37.306868 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.506033 4810 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.507995 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508027 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.508059 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="extract-content" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508075 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="extract-content" Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.508104 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508122 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.508160 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="extract-content" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508177 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="extract-content" Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.508226 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="extract-utilities" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508243 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="extract-utilities" Oct 03 07:59:45 crc kubenswrapper[4810]: E1003 07:59:45.508267 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="extract-utilities" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508280 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="extract-utilities" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508560 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0cb663-2a74-49e6-80ad-e6ee06aa7095" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.508595 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a41a867-b179-4665-b79f-de759a045f82" containerName="registry-server" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.510684 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.522928 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.634215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.634336 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlcxp\" (UniqueName: \"kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.634556 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.736368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.736482 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlcxp\" (UniqueName: \"kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.736553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.737376 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.737509 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.774077 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qlcxp\" (UniqueName: \"kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp\") pod \"redhat-marketplace-hkm8v\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:45 crc kubenswrapper[4810]: I1003 07:59:45.854272 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:46 crc kubenswrapper[4810]: I1003 07:59:46.085420 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:46 crc kubenswrapper[4810]: I1003 07:59:46.282871 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerStarted","Data":"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279"} Oct 03 07:59:46 crc kubenswrapper[4810]: I1003 07:59:46.282970 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerStarted","Data":"136b3eb5a0623fde5975b5fc1715df647f04106348d2a133ee28eecc53317457"} Oct 03 07:59:47 crc kubenswrapper[4810]: I1003 07:59:47.296602 4810 generic.go:334] "Generic (PLEG): container finished" podID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerID="42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279" exitCode=0 Oct 03 07:59:47 crc kubenswrapper[4810]: I1003 07:59:47.296739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerDied","Data":"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279"} Oct 03 07:59:49 crc kubenswrapper[4810]: I1003 07:59:49.314910 4810 generic.go:334] "Generic (PLEG): container finished" podID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerID="ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842" exitCode=0 Oct 03 07:59:49 crc kubenswrapper[4810]: I1003 07:59:49.315543 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerDied","Data":"ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842"} Oct 03 07:59:50 crc kubenswrapper[4810]: I1003 07:59:50.326394 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerStarted","Data":"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c"} Oct 03 07:59:50 crc kubenswrapper[4810]: I1003 07:59:50.348614 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hkm8v" podStartSLOduration=2.843172429 podStartE2EDuration="5.348596574s" podCreationTimestamp="2025-10-03 07:59:45 +0000 UTC" firstStartedPulling="2025-10-03 07:59:47.302121514 +0000 UTC m=+3820.729372259" lastFinishedPulling="2025-10-03 07:59:49.807545679 +0000 UTC m=+3823.234796404" observedRunningTime="2025-10-03 07:59:50.343824136 +0000 UTC m=+3823.771074871" watchObservedRunningTime="2025-10-03 07:59:50.348596574 +0000 UTC m=+3823.775847309" Oct 03 07:59:51 crc kubenswrapper[4810]: I1003 07:59:51.303018 4810 scope.go:117] "RemoveContainer" 
containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 07:59:51 crc kubenswrapper[4810]: E1003 07:59:51.303382 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 07:59:55 crc kubenswrapper[4810]: I1003 07:59:55.855506 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:55 crc kubenswrapper[4810]: I1003 07:59:55.856448 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:55 crc kubenswrapper[4810]: I1003 07:59:55.920007 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:56 crc kubenswrapper[4810]: I1003 07:59:56.463795 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:56 crc kubenswrapper[4810]: I1003 07:59:56.531475 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.407783 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hkm8v" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="registry-server" containerID="cri-o://39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c" gracePeriod=2 Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.825882 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.963399 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content\") pod \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.963577 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities\") pod \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.963608 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlcxp\" (UniqueName: \"kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp\") pod \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\" (UID: \"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5\") " Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.964751 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities" (OuterVolumeSpecName: "utilities") pod "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" (UID: "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.973769 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp" (OuterVolumeSpecName: "kube-api-access-qlcxp") pod "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" (UID: "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5"). InnerVolumeSpecName "kube-api-access-qlcxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 07:59:58 crc kubenswrapper[4810]: I1003 07:59:58.991913 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" (UID: "fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.067390 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.067485 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlcxp\" (UniqueName: \"kubernetes.io/projected/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-kube-api-access-qlcxp\") on node \"crc\" DevicePath \"\"" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.067515 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.417299 4810 generic.go:334] "Generic (PLEG): container finished" podID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerID="39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c" exitCode=0 Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.417357 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerDied","Data":"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c"} Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.417397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkm8v" event={"ID":"fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5","Type":"ContainerDied","Data":"136b3eb5a0623fde5975b5fc1715df647f04106348d2a133ee28eecc53317457"} Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.417420 4810 scope.go:117] "RemoveContainer" containerID="39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.417458 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkm8v" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.440801 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.447224 4810 scope.go:117] "RemoveContainer" containerID="ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.448079 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkm8v"] Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.468167 4810 scope.go:117] "RemoveContainer" containerID="42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.493823 4810 scope.go:117] "RemoveContainer" containerID="39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c" Oct 03 07:59:59 crc kubenswrapper[4810]: E1003 07:59:59.494327 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c\": container with ID starting with 39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c not found: ID does not exist" containerID="39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.494376 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c"} err="failed to get container status \"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c\": rpc error: code = NotFound desc = could not find container \"39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c\": container with ID starting with 39cea0a58571b7390e016242832adb4f8b36134dfe025effa55fb3819beceb1c not found: ID does not exist" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.494410 4810 scope.go:117] "RemoveContainer" containerID="ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842" Oct 03 07:59:59 crc kubenswrapper[4810]: E1003 07:59:59.494768 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842\": container with ID starting with ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842 not found: ID does not exist" containerID="ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.494848 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842"} err="failed to get container status \"ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842\": rpc error: code = NotFound desc = could not find container \"ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842\": container with ID starting with ae617da63c77469ea7407a21e194e9ef8b250189792b340aa1a2e1d11c5d2842 not found: ID does not exist" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.494883 4810 scope.go:117] "RemoveContainer" containerID="42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279" Oct 03 07:59:59 crc kubenswrapper[4810]: E1003 07:59:59.495142 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279\": container with ID starting with 42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279 not found: ID does not exist" containerID="42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279" Oct 03 07:59:59 crc kubenswrapper[4810]: I1003 07:59:59.495172 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279"} err="failed to get container status \"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279\": rpc error: code = NotFound desc = could not find container \"42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279\": container with ID starting with 42ed000ff512957c6193538edddcf8946ae4a4a424cbfe637348a7340452f279 not found: ID does not exist" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.176067 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2"] Oct 03 08:00:00 crc kubenswrapper[4810]: E1003 08:00:00.176790 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="extract-content" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.176814 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="extract-content" Oct 03 08:00:00 crc kubenswrapper[4810]: E1003 08:00:00.176848 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="extract-utilities" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.176856 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="extract-utilities" Oct 03 08:00:00 crc kubenswrapper[4810]: E1003 08:00:00.176873 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="registry-server" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.176881 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="registry-server" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.177074 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" containerName="registry-server" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.177672 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.179750 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.179960 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.182904 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.182938 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swx9n\" (UniqueName: \"kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.182968 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.186820 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2"] Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.283756 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.283814 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swx9n\" (UniqueName: \"kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.284043 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.285271 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume\") pod 
\"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.289404 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.301762 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swx9n\" (UniqueName: \"kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n\") pod \"collect-profiles-29324640-qkwb2\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.494807 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:00 crc kubenswrapper[4810]: I1003 08:00:00.721048 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2"] Oct 03 08:00:00 crc kubenswrapper[4810]: W1003 08:00:00.725003 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod867e3ee1_2322_4e01_af2f_882920acb70d.slice/crio-616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a WatchSource:0}: Error finding container 616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a: Status 404 returned error can't find the container with id 616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a Oct 03 08:00:01 crc kubenswrapper[4810]: I1003 08:00:01.314247 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5" path="/var/lib/kubelet/pods/fd2fcb8d-7b91-435b-ac0f-b3007d48e4a5/volumes" Oct 03 08:00:01 crc kubenswrapper[4810]: I1003 08:00:01.436859 4810 generic.go:334] "Generic (PLEG): container finished" podID="867e3ee1-2322-4e01-af2f-882920acb70d" containerID="1277d005e3686ba6fecbf89587112952bea4383de4b42fded4b4e54798f9463e" exitCode=0 Oct 03 08:00:01 crc kubenswrapper[4810]: I1003 08:00:01.436928 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" event={"ID":"867e3ee1-2322-4e01-af2f-882920acb70d","Type":"ContainerDied","Data":"1277d005e3686ba6fecbf89587112952bea4383de4b42fded4b4e54798f9463e"} Oct 03 08:00:01 crc kubenswrapper[4810]: I1003 08:00:01.436963 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" event={"ID":"867e3ee1-2322-4e01-af2f-882920acb70d","Type":"ContainerStarted","Data":"616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a"} Oct 03 08:00:02 crc kubenswrapper[4810]: I1003 08:00:02.303044 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:00:02 crc kubenswrapper[4810]: E1003 08:00:02.303297 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:00:02 crc kubenswrapper[4810]: I1003 08:00:02.882883 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.024732 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume\") pod \"867e3ee1-2322-4e01-af2f-882920acb70d\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.025077 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume\") pod \"867e3ee1-2322-4e01-af2f-882920acb70d\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.025243 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swx9n\" (UniqueName: \"kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n\") pod \"867e3ee1-2322-4e01-af2f-882920acb70d\" (UID: \"867e3ee1-2322-4e01-af2f-882920acb70d\") " Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.026505 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume" (OuterVolumeSpecName: "config-volume") pod "867e3ee1-2322-4e01-af2f-882920acb70d" (UID: "867e3ee1-2322-4e01-af2f-882920acb70d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.030579 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n" (OuterVolumeSpecName: "kube-api-access-swx9n") pod "867e3ee1-2322-4e01-af2f-882920acb70d" (UID: "867e3ee1-2322-4e01-af2f-882920acb70d"). InnerVolumeSpecName "kube-api-access-swx9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.036990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "867e3ee1-2322-4e01-af2f-882920acb70d" (UID: "867e3ee1-2322-4e01-af2f-882920acb70d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.126885 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swx9n\" (UniqueName: \"kubernetes.io/projected/867e3ee1-2322-4e01-af2f-882920acb70d-kube-api-access-swx9n\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.126978 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/867e3ee1-2322-4e01-af2f-882920acb70d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.127002 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/867e3ee1-2322-4e01-af2f-882920acb70d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.452869 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" event={"ID":"867e3ee1-2322-4e01-af2f-882920acb70d","Type":"ContainerDied","Data":"616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a"} Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.452937 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616c263dfad592d4f3ed6be5f6cfc2855dac6407bf98b32555b3a1f2c6876f6a" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.453015 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2" Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.965804 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk"] Oct 03 08:00:03 crc kubenswrapper[4810]: I1003 08:00:03.973274 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324595-plsxk"] Oct 03 08:00:05 crc kubenswrapper[4810]: I1003 08:00:05.313331 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0450540-6f02-4b2c-9a2b-39d93ddf303d" path="/var/lib/kubelet/pods/c0450540-6f02-4b2c-9a2b-39d93ddf303d/volumes" Oct 03 08:00:14 crc kubenswrapper[4810]: I1003 08:00:14.302515 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:00:14 crc kubenswrapper[4810]: E1003 08:00:14.303750 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:00:14 crc kubenswrapper[4810]: I1003 08:00:14.361599 4810 scope.go:117] "RemoveContainer" containerID="71662c3f83c16baacce2daf019b96af84f623461be54e3c6f0821a5d7af83e88" Oct 03 08:00:28 crc kubenswrapper[4810]: I1003 08:00:28.302281 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:00:28 crc kubenswrapper[4810]: E1003 08:00:28.303566 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:00:41 crc kubenswrapper[4810]: I1003 08:00:41.302780 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:00:41 crc kubenswrapper[4810]: E1003 08:00:41.303971 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:00:53 crc kubenswrapper[4810]: I1003 08:00:53.303298 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:00:53 crc kubenswrapper[4810]: E1003 08:00:53.304512 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:01:04 crc kubenswrapper[4810]: I1003 08:01:04.302804 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:01:04 crc kubenswrapper[4810]: I1003 08:01:04.984767 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7"} Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.588380 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:36 crc kubenswrapper[4810]: E1003 08:02:36.590079 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867e3ee1-2322-4e01-af2f-882920acb70d" containerName="collect-profiles" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.590106 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="867e3ee1-2322-4e01-af2f-882920acb70d" containerName="collect-profiles" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.590321 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="867e3ee1-2322-4e01-af2f-882920acb70d" containerName="collect-profiles" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.592007 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.602202 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.731193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxtp\" (UniqueName: \"kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.731338 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.731411 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.833414 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.833496 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.833569 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxtp\" (UniqueName: \"kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.834144 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.834463 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.868919 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zkxtp\" (UniqueName: \"kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp\") pod \"community-operators-7j7ft\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:36 crc kubenswrapper[4810]: I1003 08:02:36.924474 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:37 crc kubenswrapper[4810]: I1003 08:02:37.472489 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:37 crc kubenswrapper[4810]: I1003 08:02:37.875789 4810 generic.go:334] "Generic (PLEG): container finished" podID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerID="8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd" exitCode=0 Oct 03 08:02:37 crc kubenswrapper[4810]: I1003 08:02:37.875918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerDied","Data":"8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd"} Oct 03 08:02:37 crc kubenswrapper[4810]: I1003 08:02:37.876129 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerStarted","Data":"fa50d30a72bb3bea7b312199d340607dd83b345e00463aa48c869494dc17c966"} Oct 03 08:02:37 crc kubenswrapper[4810]: I1003 08:02:37.877850 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:02:39 crc kubenswrapper[4810]: I1003 08:02:39.894583 4810 generic.go:334] "Generic (PLEG): container finished" podID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerID="869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0" exitCode=0 Oct 03 08:02:39 crc kubenswrapper[4810]: I1003 08:02:39.894677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerDied","Data":"869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0"} Oct 03 08:02:40 crc kubenswrapper[4810]: I1003 08:02:40.909002 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerStarted","Data":"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688"} Oct 03 08:02:46 crc kubenswrapper[4810]: I1003 08:02:46.924961 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:46 crc kubenswrapper[4810]: I1003 08:02:46.925842 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:46 crc kubenswrapper[4810]: I1003 08:02:46.996732 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:47 crc kubenswrapper[4810]: I1003 08:02:47.029313 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7j7ft" podStartSLOduration=8.574927662 podStartE2EDuration="11.029284513s" podCreationTimestamp="2025-10-03 08:02:36 +0000 UTC" firstStartedPulling="2025-10-03 
08:02:37.877580756 +0000 UTC m=+3991.304831481" lastFinishedPulling="2025-10-03 08:02:40.331937597 +0000 UTC m=+3993.759188332" observedRunningTime="2025-10-03 08:02:40.930454065 +0000 UTC m=+3994.357704810" watchObservedRunningTime="2025-10-03 08:02:47.029284513 +0000 UTC m=+4000.456535288" Oct 03 08:02:47 crc kubenswrapper[4810]: I1003 08:02:47.057306 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:47 crc kubenswrapper[4810]: I1003 08:02:47.240548 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:48 crc kubenswrapper[4810]: I1003 08:02:48.974225 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7j7ft" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="registry-server" containerID="cri-o://a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688" gracePeriod=2 Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.388614 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.531979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities\") pod \"ed72a87e-b235-4995-b8d7-6e26da346d99\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.532158 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkxtp\" (UniqueName: \"kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp\") pod \"ed72a87e-b235-4995-b8d7-6e26da346d99\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.532197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content\") pod \"ed72a87e-b235-4995-b8d7-6e26da346d99\" (UID: \"ed72a87e-b235-4995-b8d7-6e26da346d99\") " Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.533301 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities" (OuterVolumeSpecName: "utilities") pod "ed72a87e-b235-4995-b8d7-6e26da346d99" (UID: "ed72a87e-b235-4995-b8d7-6e26da346d99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.538044 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp" (OuterVolumeSpecName: "kube-api-access-zkxtp") pod "ed72a87e-b235-4995-b8d7-6e26da346d99" (UID: "ed72a87e-b235-4995-b8d7-6e26da346d99"). InnerVolumeSpecName "kube-api-access-zkxtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.608273 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed72a87e-b235-4995-b8d7-6e26da346d99" (UID: "ed72a87e-b235-4995-b8d7-6e26da346d99"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.633685 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.633750 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkxtp\" (UniqueName: \"kubernetes.io/projected/ed72a87e-b235-4995-b8d7-6e26da346d99-kube-api-access-zkxtp\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.633767 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72a87e-b235-4995-b8d7-6e26da346d99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.989769 4810 generic.go:334] "Generic (PLEG): container finished" podID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerID="a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688" exitCode=0 Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.989831 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerDied","Data":"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688"} Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.989873 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7j7ft" event={"ID":"ed72a87e-b235-4995-b8d7-6e26da346d99","Type":"ContainerDied","Data":"fa50d30a72bb3bea7b312199d340607dd83b345e00463aa48c869494dc17c966"} Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.989913 4810 scope.go:117] "RemoveContainer" containerID="a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688" Oct 03 08:02:49 crc kubenswrapper[4810]: I1003 08:02:49.990003 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7j7ft" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.033642 4810 scope.go:117] "RemoveContainer" containerID="869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.035054 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.041024 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7j7ft"] Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.086149 4810 scope.go:117] "RemoveContainer" containerID="8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.108736 4810 scope.go:117] "RemoveContainer" containerID="a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688" Oct 03 08:02:50 crc kubenswrapper[4810]: E1003 08:02:50.109286 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688\": container with ID starting with a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688 not found: ID does not exist" containerID="a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.109310 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688"} err="failed to get container status \"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688\": rpc error: code = NotFound desc = could not find container \"a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688\": container with ID starting with a457063b96fd2d09c8375e8755735d1e5441038ecf81d865edacc2c26b3ea688 not found: ID does not exist" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.109332 4810 scope.go:117] "RemoveContainer" containerID="869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0" Oct 03 08:02:50 crc kubenswrapper[4810]: E1003 08:02:50.111304 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0\": container with ID starting with 869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0 not found: ID does not exist" containerID="869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.111327 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0"} err="failed to get container status \"869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0\": rpc error: code = NotFound desc = could not find container \"869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0\": container with ID starting with 869d330cf8b6d255f314c7816cd9d3cfe74141f9cca40d09c94602e2664fefa0 not found: ID does not exist" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.111341 4810 scope.go:117] "RemoveContainer" containerID="8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd" Oct 03 08:02:50 crc kubenswrapper[4810]: E1003 08:02:50.111578 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd\": container with ID starting with 8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd not found: ID does not exist" containerID="8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd" Oct 03 08:02:50 crc kubenswrapper[4810]: I1003 08:02:50.111598 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd"} err="failed to get container status \"8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd\": rpc error: code = NotFound desc = could not find container \"8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd\": container with ID starting with 8625d3bd481ac7870106ef408216e03f600df08e44395bfda6e39f4441ddd4bd not found: ID does not exist" Oct 03 08:02:51 crc kubenswrapper[4810]: I1003 08:02:51.315187 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" path="/var/lib/kubelet/pods/ed72a87e-b235-4995-b8d7-6e26da346d99/volumes" Oct 03 08:03:32 crc kubenswrapper[4810]: I1003 08:03:32.089145 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:03:32 crc kubenswrapper[4810]: I1003 08:03:32.089711 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:04:02 crc kubenswrapper[4810]: I1003 08:04:02.088827 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:04:02 crc kubenswrapper[4810]: I1003 08:04:02.089622 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.089077 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.089876 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.090032 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.091047 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.091182 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7" gracePeriod=600 Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.914336 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7" exitCode=0 Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.914412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7"} Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.914850 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54"} Oct 03 08:04:32 crc kubenswrapper[4810]: I1003 08:04:32.914878 4810 scope.go:117] "RemoveContainer" containerID="d1adb92898ec073c44091d3e8ba3d652e8934495d205bdcb582297c6717ed98f" Oct 03 08:06:32 crc kubenswrapper[4810]: I1003 08:06:32.089005 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:06:32 crc kubenswrapper[4810]: I1003 08:06:32.090985 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:07:02 crc kubenswrapper[4810]: I1003 08:07:02.089018 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:07:02 crc kubenswrapper[4810]: I1003 08:07:02.089639 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:07:13 crc 
kubenswrapper[4810]: I1003 08:07:13.500107 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:13 crc kubenswrapper[4810]: E1003 08:07:13.503142 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="registry-server" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.503176 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="registry-server" Oct 03 08:07:13 crc kubenswrapper[4810]: E1003 08:07:13.503226 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="extract-utilities" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.503242 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="extract-utilities" Oct 03 08:07:13 crc kubenswrapper[4810]: E1003 08:07:13.503265 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="extract-content" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.503280 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="extract-content" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.503572 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed72a87e-b235-4995-b8d7-6e26da346d99" containerName="registry-server" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.505400 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.513022 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.692267 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.692360 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.692387 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhlx\" (UniqueName: \"kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.794238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: 
I1003 08:07:13.794327 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.794352 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhlx\" (UniqueName: \"kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.795373 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.795438 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.820449 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhlx\" (UniqueName: \"kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx\") pod \"redhat-operators-w5wcs\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:13 crc kubenswrapper[4810]: I1003 08:07:13.828051 4810 util.go:30] "No sandbox for pod can be found. 
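
The three volumes mounted above for redhat-operators-w5wcs follow directly from the UniqueName plugin prefixes in the log: two kubernetes.io/empty-dir volumes (utilities, catalog-content) and one kubernetes.io/projected volume (kube-api-access-kwhlx). A rough reconstruction of that volume list using k8s.io/api types; the contents of the projected volume (service-account token, CA bundle, namespace) are the usual kube-api-access sources and are an assumption here, not visible in the log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        volumes := []corev1.Volume{
            // kubernetes.io/empty-dir, per the UniqueName prefixes in the log.
            {Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            // kubernetes.io/projected; sources (token, kube-root-ca.crt, namespace) assumed.
            {Name: "kube-api-access-kwhlx", VolumeSource: corev1.VolumeSource{Projected: &corev1.ProjectedVolumeSource{}}},
        }
        for _, v := range volumes {
            fmt.Println(v.Name)
        }
    }
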
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:14 crc kubenswrapper[4810]: I1003 08:07:14.337206 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:15 crc kubenswrapper[4810]: I1003 08:07:15.345801 4810 generic.go:334] "Generic (PLEG): container finished" podID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerID="9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f" exitCode=0 Oct 03 08:07:15 crc kubenswrapper[4810]: I1003 08:07:15.345880 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerDied","Data":"9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f"} Oct 03 08:07:15 crc kubenswrapper[4810]: I1003 08:07:15.346454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerStarted","Data":"1fc55005aacbc5c96bad84dec8673615d1c9fa85b2db8108101b9847764cad24"} Oct 03 08:07:16 crc kubenswrapper[4810]: I1003 08:07:16.379289 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerStarted","Data":"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c"} Oct 03 08:07:17 crc kubenswrapper[4810]: I1003 08:07:17.388591 4810 generic.go:334] "Generic (PLEG): container finished" podID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerID="965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c" exitCode=0 Oct 03 08:07:17 crc kubenswrapper[4810]: I1003 08:07:17.388675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerDied","Data":"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c"} Oct 03 08:07:18 crc kubenswrapper[4810]: I1003 08:07:18.396863 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerStarted","Data":"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38"} Oct 03 08:07:23 crc kubenswrapper[4810]: I1003 08:07:23.829237 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:23 crc kubenswrapper[4810]: I1003 08:07:23.830051 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:23 crc kubenswrapper[4810]: I1003 08:07:23.883242 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:23 crc kubenswrapper[4810]: I1003 08:07:23.914584 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5wcs" podStartSLOduration=8.203102274999999 podStartE2EDuration="10.914552487s" podCreationTimestamp="2025-10-03 08:07:13 +0000 UTC" firstStartedPulling="2025-10-03 08:07:15.34921743 +0000 UTC m=+4268.776468205" lastFinishedPulling="2025-10-03 08:07:18.060667372 +0000 UTC m=+4271.487918417" observedRunningTime="2025-10-03 08:07:18.416398174 +0000 UTC m=+4271.843648929" watchObservedRunningTime="2025-10-03 08:07:23.914552487 +0000 UTC 
m=+4277.341803232" Oct 03 08:07:24 crc kubenswrapper[4810]: I1003 08:07:24.509391 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:24 crc kubenswrapper[4810]: I1003 08:07:24.576097 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.458376 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5wcs" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="registry-server" containerID="cri-o://81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38" gracePeriod=2 Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.841635 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.914751 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content\") pod \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.914823 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhlx\" (UniqueName: \"kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx\") pod \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.914883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities\") pod \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\" (UID: \"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77\") " Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.916147 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities" (OuterVolumeSpecName: "utilities") pod "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" (UID: "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:07:26 crc kubenswrapper[4810]: I1003 08:07:26.921509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx" (OuterVolumeSpecName: "kube-api-access-kwhlx") pod "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" (UID: "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77"). InnerVolumeSpecName "kube-api-access-kwhlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.016087 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhlx\" (UniqueName: \"kubernetes.io/projected/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-kube-api-access-kwhlx\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.016132 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.433019 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" (UID: "bcc360e3-34ac-499f-abcb-4eb5c7ea5a77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.468053 4810 generic.go:334] "Generic (PLEG): container finished" podID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerID="81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38" exitCode=0 Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.468106 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5wcs" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.468106 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerDied","Data":"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38"} Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.468213 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5wcs" event={"ID":"bcc360e3-34ac-499f-abcb-4eb5c7ea5a77","Type":"ContainerDied","Data":"1fc55005aacbc5c96bad84dec8673615d1c9fa85b2db8108101b9847764cad24"} Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.468233 4810 scope.go:117] "RemoveContainer" containerID="81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.490188 4810 scope.go:117] "RemoveContainer" containerID="965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.508446 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.514371 4810 scope.go:117] "RemoveContainer" containerID="9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.514624 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5wcs"] Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.523755 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.538777 4810 scope.go:117] "RemoveContainer" containerID="81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38" Oct 03 08:07:27 crc kubenswrapper[4810]: E1003 08:07:27.539289 4810 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38\": container with ID starting with 81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38 not found: ID does not exist" containerID="81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.539404 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38"} err="failed to get container status \"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38\": rpc error: code = NotFound desc = could not find container \"81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38\": container with ID starting with 81db24702ebdc1e38ff020af87b6881cb2cfe9afe9eea63ffd28be489949bf38 not found: ID does not exist" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.539504 4810 scope.go:117] "RemoveContainer" containerID="965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c" Oct 03 08:07:27 crc kubenswrapper[4810]: E1003 08:07:27.539851 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c\": container with ID starting with 965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c not found: ID does not exist" containerID="965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.539918 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c"} err="failed to get container status \"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c\": rpc error: code = NotFound desc = could not find container \"965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c\": container with ID starting with 965d8f80e456004e57aa2079147bbd8d5b9e215e7618185f366a97b6b0c5321c not found: ID does not exist" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.539949 4810 scope.go:117] "RemoveContainer" containerID="9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f" Oct 03 08:07:27 crc kubenswrapper[4810]: E1003 08:07:27.540599 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f\": container with ID starting with 9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f not found: ID does not exist" containerID="9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f" Oct 03 08:07:27 crc kubenswrapper[4810]: I1003 08:07:27.540632 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f"} err="failed to get container status \"9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f\": rpc error: code = NotFound desc = could not find container \"9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f\": container with ID starting with 9e1d715f5ba6f3695c36233819e9fc935f0807f34b3009e91102f742f0e3633f not found: ID does not exist" Oct 03 08:07:29 crc kubenswrapper[4810]: I1003 08:07:29.312670 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" path="/var/lib/kubelet/pods/bcc360e3-34ac-499f-abcb-4eb5c7ea5a77/volumes" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.088459 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.088561 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.088679 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.089487 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.089558 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" gracePeriod=600 Oct 03 08:07:32 crc kubenswrapper[4810]: E1003 08:07:32.223433 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.513023 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" exitCode=0 Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.513074 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54"} Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.513120 4810 scope.go:117] "RemoveContainer" containerID="d1c3e49be019ecfdc77fec141760a7980e042866f0e858121150b41fa1aa9ec7" Oct 03 08:07:32 crc kubenswrapper[4810]: I1003 08:07:32.514042 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:07:32 crc kubenswrapper[4810]: E1003 08:07:32.514293 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:07:44 crc kubenswrapper[4810]: I1003 08:07:44.303567 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:07:44 crc kubenswrapper[4810]: E1003 08:07:44.304530 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:07:55 crc kubenswrapper[4810]: I1003 08:07:55.303150 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:07:55 crc kubenswrapper[4810]: E1003 08:07:55.304088 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:08:08 crc kubenswrapper[4810]: I1003 08:08:08.302530 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:08:08 crc kubenswrapper[4810]: E1003 08:08:08.303311 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:08:20 crc kubenswrapper[4810]: I1003 08:08:20.303356 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:08:20 crc kubenswrapper[4810]: E1003 08:08:20.304293 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:08:31 crc kubenswrapper[4810]: I1003 08:08:31.302772 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:08:31 crc kubenswrapper[4810]: E1003 08:08:31.303706 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:08:44 crc kubenswrapper[4810]: I1003 08:08:44.302355 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:08:44 crc kubenswrapper[4810]: E1003 08:08:44.303224 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:08:57 crc kubenswrapper[4810]: I1003 08:08:57.310069 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:08:57 crc kubenswrapper[4810]: E1003 08:08:57.311339 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:09:09 crc kubenswrapper[4810]: I1003 08:09:09.303868 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:09:09 crc kubenswrapper[4810]: E1003 08:09:09.304571 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:09:22 crc kubenswrapper[4810]: I1003 08:09:22.303023 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:09:22 crc kubenswrapper[4810]: E1003 08:09:22.304151 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:09:34 crc kubenswrapper[4810]: I1003 08:09:34.302856 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:09:34 crc kubenswrapper[4810]: E1003 08:09:34.303766 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:09:46 crc kubenswrapper[4810]: I1003 08:09:46.302643 4810 
scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:09:46 crc kubenswrapper[4810]: E1003 08:09:46.303407 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:10:01 crc kubenswrapper[4810]: I1003 08:10:01.303619 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:10:01 crc kubenswrapper[4810]: E1003 08:10:01.304741 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:10:13 crc kubenswrapper[4810]: I1003 08:10:13.302028 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:10:13 crc kubenswrapper[4810]: E1003 08:10:13.303453 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:10:26 crc kubenswrapper[4810]: I1003 08:10:26.303118 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:10:26 crc kubenswrapper[4810]: E1003 08:10:26.303878 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:10:38 crc kubenswrapper[4810]: I1003 08:10:38.302539 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:10:38 crc kubenswrapper[4810]: E1003 08:10:38.303275 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:10:52 crc kubenswrapper[4810]: I1003 08:10:52.303407 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:10:52 crc kubenswrapper[4810]: E1003 08:10:52.304095 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:11:04 crc kubenswrapper[4810]: I1003 08:11:04.302653 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:11:04 crc kubenswrapper[4810]: E1003 08:11:04.304155 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:11:17 crc kubenswrapper[4810]: I1003 08:11:17.307708 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:11:17 crc kubenswrapper[4810]: E1003 08:11:17.310328 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.497227 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:20 crc kubenswrapper[4810]: E1003 08:11:20.497781 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="extract-content" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.497792 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="extract-content" Oct 03 08:11:20 crc kubenswrapper[4810]: E1003 08:11:20.497808 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="registry-server" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.497814 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="registry-server" Oct 03 08:11:20 crc kubenswrapper[4810]: E1003 08:11:20.497832 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="extract-utilities" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.497839 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="extract-utilities" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.497980 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc360e3-34ac-499f-abcb-4eb5c7ea5a77" containerName="registry-server" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.498936 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.517238 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.574114 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.574169 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.574207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfp8x\" (UniqueName: \"kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.675980 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.676072 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.676127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfp8x\" (UniqueName: \"kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.678632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.680866 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.710492 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sfp8x\" (UniqueName: \"kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x\") pod \"redhat-marketplace-4rtsp\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:20 crc kubenswrapper[4810]: I1003 08:11:20.833656 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:21 crc kubenswrapper[4810]: I1003 08:11:21.310204 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:21 crc kubenswrapper[4810]: I1003 08:11:21.472418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerStarted","Data":"4edac27189a507f2b52c4ba7f3d833fbbb223d1a2de6b044b4918a3429da3afc"} Oct 03 08:11:22 crc kubenswrapper[4810]: I1003 08:11:22.481155 4810 generic.go:334] "Generic (PLEG): container finished" podID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerID="6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6" exitCode=0 Oct 03 08:11:22 crc kubenswrapper[4810]: I1003 08:11:22.481222 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerDied","Data":"6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6"} Oct 03 08:11:22 crc kubenswrapper[4810]: I1003 08:11:22.483737 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:11:24 crc kubenswrapper[4810]: I1003 08:11:24.496353 4810 generic.go:334] "Generic (PLEG): container finished" podID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerID="cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974" exitCode=0 Oct 03 08:11:24 crc kubenswrapper[4810]: I1003 08:11:24.496395 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerDied","Data":"cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974"} Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.490088 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.492116 4810 util.go:30] "No sandbox for pod can be found. 
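
The "SyncLoop ADD/UPDATE" and PLEG ContainerStarted/ContainerDied entries above are the kubelet's view of the same lifecycle an API client sees as watch events on the Pod objects. A minimal watch over the openshift-marketplace namespace, where the catalog pods in this log live; the selector-free watch and the event handling are illustrative assumptions, not something the log shows:

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        w, err := client.CoreV1().Pods("openshift-marketplace").Watch(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        defer w.Stop()

        for ev := range w.ResultChan() {
            pod, ok := ev.Object.(*corev1.Pod)
            if !ok {
                continue
            }
            // ADDED/MODIFIED/DELETED here line up with the kubelet's
            // "SyncLoop ADD/UPDATE/DELETE" entries for the same pods.
            fmt.Printf("%s %s phase=%s\n", ev.Type, pod.Name, pod.Status.Phase)
        }
    }
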
Need to start a new one" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.504765 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.511989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerStarted","Data":"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214"} Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.554812 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.555200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.555331 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftkj\" (UniqueName: \"kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.561135 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rtsp" podStartSLOduration=3.12607837 podStartE2EDuration="5.561115364s" podCreationTimestamp="2025-10-03 08:11:20 +0000 UTC" firstStartedPulling="2025-10-03 08:11:22.483492798 +0000 UTC m=+4515.910743533" lastFinishedPulling="2025-10-03 08:11:24.918529782 +0000 UTC m=+4518.345780527" observedRunningTime="2025-10-03 08:11:25.554995183 +0000 UTC m=+4518.982245928" watchObservedRunningTime="2025-10-03 08:11:25.561115364 +0000 UTC m=+4518.988366099" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.656622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.656690 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.656721 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mftkj\" (UniqueName: \"kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj\") pod \"certified-operators-xdt9z\" (UID: 
\"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.657196 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.657613 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.698378 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftkj\" (UniqueName: \"kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj\") pod \"certified-operators-xdt9z\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:25 crc kubenswrapper[4810]: I1003 08:11:25.812190 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:26 crc kubenswrapper[4810]: I1003 08:11:26.102225 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:26 crc kubenswrapper[4810]: I1003 08:11:26.526642 4810 generic.go:334] "Generic (PLEG): container finished" podID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerID="a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751" exitCode=0 Oct 03 08:11:26 crc kubenswrapper[4810]: I1003 08:11:26.526769 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerDied","Data":"a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751"} Oct 03 08:11:26 crc kubenswrapper[4810]: I1003 08:11:26.527279 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerStarted","Data":"8d65fc5f64cf1430ba9ef249f04a59136895e2414f4519c49cd438cfbb924e7c"} Oct 03 08:11:28 crc kubenswrapper[4810]: I1003 08:11:28.549179 4810 generic.go:334] "Generic (PLEG): container finished" podID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerID="0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754" exitCode=0 Oct 03 08:11:28 crc kubenswrapper[4810]: I1003 08:11:28.549748 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerDied","Data":"0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754"} Oct 03 08:11:29 crc kubenswrapper[4810]: I1003 08:11:29.560852 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerStarted","Data":"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1"} Oct 03 08:11:29 crc kubenswrapper[4810]: I1003 08:11:29.588112 4810 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-xdt9z" podStartSLOduration=1.8142740000000002 podStartE2EDuration="4.588096634s" podCreationTimestamp="2025-10-03 08:11:25 +0000 UTC" firstStartedPulling="2025-10-03 08:11:26.528150789 +0000 UTC m=+4519.955401524" lastFinishedPulling="2025-10-03 08:11:29.301973393 +0000 UTC m=+4522.729224158" observedRunningTime="2025-10-03 08:11:29.582157099 +0000 UTC m=+4523.009407844" watchObservedRunningTime="2025-10-03 08:11:29.588096634 +0000 UTC m=+4523.015347359" Oct 03 08:11:30 crc kubenswrapper[4810]: I1003 08:11:30.835337 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:30 crc kubenswrapper[4810]: I1003 08:11:30.835724 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:30 crc kubenswrapper[4810]: I1003 08:11:30.885848 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:31 crc kubenswrapper[4810]: I1003 08:11:31.618865 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:32 crc kubenswrapper[4810]: I1003 08:11:32.303379 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:11:32 crc kubenswrapper[4810]: E1003 08:11:32.303979 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:11:32 crc kubenswrapper[4810]: I1003 08:11:32.874198 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:33 crc kubenswrapper[4810]: I1003 08:11:33.593803 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rtsp" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="registry-server" containerID="cri-o://08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214" gracePeriod=2 Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.068385 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.182531 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities\") pod \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.182589 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfp8x\" (UniqueName: \"kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x\") pod \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.182674 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content\") pod \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\" (UID: \"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b\") " Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.183451 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities" (OuterVolumeSpecName: "utilities") pod "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" (UID: "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.196126 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x" (OuterVolumeSpecName: "kube-api-access-sfp8x") pod "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" (UID: "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b"). InnerVolumeSpecName "kube-api-access-sfp8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.198146 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" (UID: "095d602d-0c54-4f23-ba09-b5b9c6bd1e9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.284648 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.284686 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfp8x\" (UniqueName: \"kubernetes.io/projected/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-kube-api-access-sfp8x\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.284696 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.602337 4810 generic.go:334] "Generic (PLEG): container finished" podID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerID="08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214" exitCode=0 Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.602380 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerDied","Data":"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214"} Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.602406 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rtsp" event={"ID":"095d602d-0c54-4f23-ba09-b5b9c6bd1e9b","Type":"ContainerDied","Data":"4edac27189a507f2b52c4ba7f3d833fbbb223d1a2de6b044b4918a3429da3afc"} Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.602423 4810 scope.go:117] "RemoveContainer" containerID="08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.602566 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rtsp" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.627522 4810 scope.go:117] "RemoveContainer" containerID="cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.637560 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.642731 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rtsp"] Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.653651 4810 scope.go:117] "RemoveContainer" containerID="6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.674739 4810 scope.go:117] "RemoveContainer" containerID="08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214" Oct 03 08:11:34 crc kubenswrapper[4810]: E1003 08:11:34.675229 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214\": container with ID starting with 08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214 not found: ID does not exist" containerID="08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.675411 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214"} err="failed to get container status \"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214\": rpc error: code = NotFound desc = could not find container \"08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214\": container with ID starting with 08e0c77d94cb3e875c4a0ce45f172b3a19d1d0e752099c1a4af6cfe6c43cb214 not found: ID does not exist" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.675529 4810 scope.go:117] "RemoveContainer" containerID="cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974" Oct 03 08:11:34 crc kubenswrapper[4810]: E1003 08:11:34.675937 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974\": container with ID starting with cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974 not found: ID does not exist" containerID="cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.676049 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974"} err="failed to get container status \"cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974\": rpc error: code = NotFound desc = could not find container \"cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974\": container with ID starting with cd44ffe82164470fcb7c8f3850a1cd4d29e6e1a20fef3d69b581bd458539e974 not found: ID does not exist" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.676133 4810 scope.go:117] "RemoveContainer" containerID="6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6" Oct 03 08:11:34 crc kubenswrapper[4810]: E1003 08:11:34.676494 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6\": container with ID starting with 6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6 not found: ID does not exist" containerID="6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6" Oct 03 08:11:34 crc kubenswrapper[4810]: I1003 08:11:34.676539 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6"} err="failed to get container status \"6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6\": rpc error: code = NotFound desc = could not find container \"6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6\": container with ID starting with 6fc4fd4e392fff124c864e01ee8c4b4882739d2922b6831b4026c116a7f4e6d6 not found: ID does not exist" Oct 03 08:11:35 crc kubenswrapper[4810]: I1003 08:11:35.320433 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" path="/var/lib/kubelet/pods/095d602d-0c54-4f23-ba09-b5b9c6bd1e9b/volumes" Oct 03 08:11:35 crc kubenswrapper[4810]: I1003 08:11:35.812691 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:35 crc kubenswrapper[4810]: I1003 08:11:35.812738 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:35 crc kubenswrapper[4810]: I1003 08:11:35.881128 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:36 crc kubenswrapper[4810]: I1003 08:11:36.657541 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:37 crc kubenswrapper[4810]: I1003 08:11:37.075639 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:38 crc kubenswrapper[4810]: I1003 08:11:38.631947 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdt9z" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="registry-server" containerID="cri-o://d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1" gracePeriod=2 Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.033975 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.151927 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities\") pod \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.152337 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content\") pod \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.152434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mftkj\" (UniqueName: \"kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj\") pod \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\" (UID: \"bebfbb7a-e19c-4430-b3d6-bcaa297b959b\") " Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.155990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities" (OuterVolumeSpecName: "utilities") pod "bebfbb7a-e19c-4430-b3d6-bcaa297b959b" (UID: "bebfbb7a-e19c-4430-b3d6-bcaa297b959b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.159183 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj" (OuterVolumeSpecName: "kube-api-access-mftkj") pod "bebfbb7a-e19c-4430-b3d6-bcaa297b959b" (UID: "bebfbb7a-e19c-4430-b3d6-bcaa297b959b"). InnerVolumeSpecName "kube-api-access-mftkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.203376 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bebfbb7a-e19c-4430-b3d6-bcaa297b959b" (UID: "bebfbb7a-e19c-4430-b3d6-bcaa297b959b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.255019 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.255064 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.255081 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mftkj\" (UniqueName: \"kubernetes.io/projected/bebfbb7a-e19c-4430-b3d6-bcaa297b959b-kube-api-access-mftkj\") on node \"crc\" DevicePath \"\"" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.640175 4810 generic.go:334] "Generic (PLEG): container finished" podID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerID="d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1" exitCode=0 Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.640227 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerDied","Data":"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1"} Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.640268 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdt9z" event={"ID":"bebfbb7a-e19c-4430-b3d6-bcaa297b959b","Type":"ContainerDied","Data":"8d65fc5f64cf1430ba9ef249f04a59136895e2414f4519c49cd438cfbb924e7c"} Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.640290 4810 scope.go:117] "RemoveContainer" containerID="d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.640281 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdt9z" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.661412 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.661706 4810 scope.go:117] "RemoveContainer" containerID="0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.668762 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdt9z"] Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.677872 4810 scope.go:117] "RemoveContainer" containerID="a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.707753 4810 scope.go:117] "RemoveContainer" containerID="d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1" Oct 03 08:11:39 crc kubenswrapper[4810]: E1003 08:11:39.708193 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1\": container with ID starting with d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1 not found: ID does not exist" containerID="d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.708224 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1"} err="failed to get container status \"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1\": rpc error: code = NotFound desc = could not find container \"d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1\": container with ID starting with d9cbbf64ffe5b91ab48c591bf8e52a08b49a22e68fb1a1378a70261b87f5c7e1 not found: ID does not exist" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.708243 4810 scope.go:117] "RemoveContainer" containerID="0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754" Oct 03 08:11:39 crc kubenswrapper[4810]: E1003 08:11:39.708962 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754\": container with ID starting with 0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754 not found: ID does not exist" containerID="0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.708986 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754"} err="failed to get container status \"0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754\": rpc error: code = NotFound desc = could not find container \"0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754\": container with ID starting with 0d9cd2c5f6137a939e056f29f59cbc7b032bab789c08229c4250ce4dc188a754 not found: ID does not exist" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.709004 4810 scope.go:117] "RemoveContainer" containerID="a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751" Oct 03 08:11:39 crc kubenswrapper[4810]: E1003 08:11:39.709499 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751\": container with ID starting with a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751 not found: ID does not exist" containerID="a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751" Oct 03 08:11:39 crc kubenswrapper[4810]: I1003 08:11:39.709560 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751"} err="failed to get container status \"a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751\": rpc error: code = NotFound desc = could not find container \"a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751\": container with ID starting with a7bce84780b9edd7607eeedaebcef0bcf6c1a3926205e7b989d42dbc9549b751 not found: ID does not exist" Oct 03 08:11:41 crc kubenswrapper[4810]: I1003 08:11:41.317494 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" path="/var/lib/kubelet/pods/bebfbb7a-e19c-4430-b3d6-bcaa297b959b/volumes" Oct 03 08:11:47 crc kubenswrapper[4810]: I1003 08:11:47.309998 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:11:47 crc kubenswrapper[4810]: E1003 08:11:47.311182 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:11:58 crc kubenswrapper[4810]: I1003 08:11:58.302791 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:11:58 crc kubenswrapper[4810]: E1003 08:11:58.304306 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:12:10 crc kubenswrapper[4810]: I1003 08:12:10.302718 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:12:10 crc kubenswrapper[4810]: E1003 08:12:10.303578 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:12:25 crc kubenswrapper[4810]: I1003 08:12:25.302576 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:12:25 crc kubenswrapper[4810]: E1003 08:12:25.304060 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:12:36 crc kubenswrapper[4810]: I1003 08:12:36.303149 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:12:37 crc kubenswrapper[4810]: I1003 08:12:37.082076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9"} Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.701490 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.703596 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.703701 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.703802 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="extract-utilities" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.703877 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="extract-utilities" Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.704004 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704080 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.704169 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="extract-content" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704245 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="extract-content" Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.704338 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="extract-content" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704415 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="extract-content" Oct 03 08:13:23 crc kubenswrapper[4810]: E1003 08:13:23.704504 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="extract-utilities" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704599 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="extract-utilities" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704851 4810 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="095d602d-0c54-4f23-ba09-b5b9c6bd1e9b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.704967 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebfbb7a-e19c-4430-b3d6-bcaa297b959b" containerName="registry-server" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.706389 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.713079 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.895922 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.895974 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.896023 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptp94\" (UniqueName: \"kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.997348 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.997417 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.997463 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptp94\" (UniqueName: \"kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.998329 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:23 crc kubenswrapper[4810]: I1003 08:13:23.998670 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:24 crc kubenswrapper[4810]: I1003 08:13:24.223975 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptp94\" (UniqueName: \"kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94\") pod \"community-operators-ff479\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:24 crc kubenswrapper[4810]: I1003 08:13:24.333362 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:24 crc kubenswrapper[4810]: I1003 08:13:24.797516 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:25 crc kubenswrapper[4810]: I1003 08:13:25.455506 4810 generic.go:334] "Generic (PLEG): container finished" podID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerID="4bb8e1a9f899c913b3fd77ac8e996dabfd8190942ab0d82f3af2fdf6198c00d9" exitCode=0 Oct 03 08:13:25 crc kubenswrapper[4810]: I1003 08:13:25.455595 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerDied","Data":"4bb8e1a9f899c913b3fd77ac8e996dabfd8190942ab0d82f3af2fdf6198c00d9"} Oct 03 08:13:25 crc kubenswrapper[4810]: I1003 08:13:25.455927 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerStarted","Data":"414e379d26d76986e7c4a588604d9f145eb070d6e2d96aa0985fecdb8cd31ca2"} Oct 03 08:13:26 crc kubenswrapper[4810]: I1003 08:13:26.465675 4810 generic.go:334] "Generic (PLEG): container finished" podID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerID="f66bd455953b9786c842209fc4a594a73622794ab61bbc1809feea8d5c7f6774" exitCode=0 Oct 03 08:13:26 crc kubenswrapper[4810]: I1003 08:13:26.465818 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerDied","Data":"f66bd455953b9786c842209fc4a594a73622794ab61bbc1809feea8d5c7f6774"} Oct 03 08:13:27 crc kubenswrapper[4810]: I1003 08:13:27.475752 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerStarted","Data":"6a030a8561109fb3f0249a3ebe2c9b0b8145af4d1d4f4a3f602f4c16f30ed061"} Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.335029 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.335543 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.393633 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.415165 4810 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ff479" podStartSLOduration=9.867715092 podStartE2EDuration="11.415144311s" podCreationTimestamp="2025-10-03 08:13:23 +0000 UTC" firstStartedPulling="2025-10-03 08:13:25.457134605 +0000 UTC m=+4638.884385340" lastFinishedPulling="2025-10-03 08:13:27.004563824 +0000 UTC m=+4640.431814559" observedRunningTime="2025-10-03 08:13:27.496070851 +0000 UTC m=+4640.923321586" watchObservedRunningTime="2025-10-03 08:13:34.415144311 +0000 UTC m=+4647.842395046" Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.573530 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:34 crc kubenswrapper[4810]: I1003 08:13:34.632871 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:36 crc kubenswrapper[4810]: I1003 08:13:36.539770 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ff479" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="registry-server" containerID="cri-o://6a030a8561109fb3f0249a3ebe2c9b0b8145af4d1d4f4a3f602f4c16f30ed061" gracePeriod=2 Oct 03 08:13:37 crc kubenswrapper[4810]: I1003 08:13:37.552488 4810 generic.go:334] "Generic (PLEG): container finished" podID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerID="6a030a8561109fb3f0249a3ebe2c9b0b8145af4d1d4f4a3f602f4c16f30ed061" exitCode=0 Oct 03 08:13:37 crc kubenswrapper[4810]: I1003 08:13:37.552597 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerDied","Data":"6a030a8561109fb3f0249a3ebe2c9b0b8145af4d1d4f4a3f602f4c16f30ed061"} Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.826142 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.913888 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content\") pod \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.914039 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptp94\" (UniqueName: \"kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94\") pod \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.914108 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities\") pod \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\" (UID: \"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd\") " Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.914979 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities" (OuterVolumeSpecName: "utilities") pod "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" (UID: "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:13:38 crc kubenswrapper[4810]: I1003 08:13:38.918668 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94" (OuterVolumeSpecName: "kube-api-access-ptp94") pod "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" (UID: "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd"). InnerVolumeSpecName "kube-api-access-ptp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.016080 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.016107 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptp94\" (UniqueName: \"kubernetes.io/projected/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-kube-api-access-ptp94\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.566562 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" (UID: "b2b77709-e2a4-467c-ab8c-d3557ea8ffbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.585110 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ff479" event={"ID":"b2b77709-e2a4-467c-ab8c-d3557ea8ffbd","Type":"ContainerDied","Data":"414e379d26d76986e7c4a588604d9f145eb070d6e2d96aa0985fecdb8cd31ca2"} Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.585201 4810 scope.go:117] "RemoveContainer" containerID="6a030a8561109fb3f0249a3ebe2c9b0b8145af4d1d4f4a3f602f4c16f30ed061" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.585215 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ff479" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.623503 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.625387 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.629771 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ff479"] Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.633784 4810 scope.go:117] "RemoveContainer" containerID="f66bd455953b9786c842209fc4a594a73622794ab61bbc1809feea8d5c7f6774" Oct 03 08:13:39 crc kubenswrapper[4810]: I1003 08:13:39.653124 4810 scope.go:117] "RemoveContainer" containerID="4bb8e1a9f899c913b3fd77ac8e996dabfd8190942ab0d82f3af2fdf6198c00d9" Oct 03 08:13:41 crc kubenswrapper[4810]: I1003 08:13:41.312307 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" path="/var/lib/kubelet/pods/b2b77709-e2a4-467c-ab8c-d3557ea8ffbd/volumes" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.138147 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn"] Oct 03 08:15:00 crc kubenswrapper[4810]: E1003 08:15:00.140984 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="extract-utilities" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.141082 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="extract-utilities" Oct 03 08:15:00 crc kubenswrapper[4810]: E1003 08:15:00.141145 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="registry-server" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.141201 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="registry-server" Oct 03 08:15:00 crc kubenswrapper[4810]: E1003 08:15:00.141283 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="extract-content" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.141339 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="extract-content" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.141555 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b77709-e2a4-467c-ab8c-d3557ea8ffbd" containerName="registry-server" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.142118 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.145451 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.145723 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.154716 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn"] Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.192180 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.192259 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdwr5\" (UniqueName: \"kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.192294 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.293415 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.293497 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.293549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdwr5\" (UniqueName: \"kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.295273 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume\") pod 
\"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.307415 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.308930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdwr5\" (UniqueName: \"kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5\") pod \"collect-profiles-29324655-mpsfn\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.467186 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:00 crc kubenswrapper[4810]: I1003 08:15:00.903714 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn"] Oct 03 08:15:01 crc kubenswrapper[4810]: I1003 08:15:01.261636 4810 generic.go:334] "Generic (PLEG): container finished" podID="9c14d633-6153-4c93-acaf-e8220d5e601b" containerID="29e1c159ec9e5d74ee70771b08953deee185a0b89e454f44ac410ea51d656962" exitCode=0 Oct 03 08:15:01 crc kubenswrapper[4810]: I1003 08:15:01.261690 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" event={"ID":"9c14d633-6153-4c93-acaf-e8220d5e601b","Type":"ContainerDied","Data":"29e1c159ec9e5d74ee70771b08953deee185a0b89e454f44ac410ea51d656962"} Oct 03 08:15:01 crc kubenswrapper[4810]: I1003 08:15:01.261736 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" event={"ID":"9c14d633-6153-4c93-acaf-e8220d5e601b","Type":"ContainerStarted","Data":"8bded89d62a42cb2a0627745807987672ffc99eaeb06b65efeb3c5f209f476bc"} Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.088516 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.088861 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.557630 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.627094 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume\") pod \"9c14d633-6153-4c93-acaf-e8220d5e601b\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.627211 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume\") pod \"9c14d633-6153-4c93-acaf-e8220d5e601b\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.627270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdwr5\" (UniqueName: \"kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5\") pod \"9c14d633-6153-4c93-acaf-e8220d5e601b\" (UID: \"9c14d633-6153-4c93-acaf-e8220d5e601b\") " Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.628040 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c14d633-6153-4c93-acaf-e8220d5e601b" (UID: "9c14d633-6153-4c93-acaf-e8220d5e601b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.635929 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9c14d633-6153-4c93-acaf-e8220d5e601b" (UID: "9c14d633-6153-4c93-acaf-e8220d5e601b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.637578 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5" (OuterVolumeSpecName: "kube-api-access-sdwr5") pod "9c14d633-6153-4c93-acaf-e8220d5e601b" (UID: "9c14d633-6153-4c93-acaf-e8220d5e601b"). InnerVolumeSpecName "kube-api-access-sdwr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.729019 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c14d633-6153-4c93-acaf-e8220d5e601b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.729057 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c14d633-6153-4c93-acaf-e8220d5e601b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:02 crc kubenswrapper[4810]: I1003 08:15:02.729067 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdwr5\" (UniqueName: \"kubernetes.io/projected/9c14d633-6153-4c93-acaf-e8220d5e601b-kube-api-access-sdwr5\") on node \"crc\" DevicePath \"\"" Oct 03 08:15:03 crc kubenswrapper[4810]: I1003 08:15:03.280476 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" Oct 03 08:15:03 crc kubenswrapper[4810]: I1003 08:15:03.280453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn" event={"ID":"9c14d633-6153-4c93-acaf-e8220d5e601b","Type":"ContainerDied","Data":"8bded89d62a42cb2a0627745807987672ffc99eaeb06b65efeb3c5f209f476bc"} Oct 03 08:15:03 crc kubenswrapper[4810]: I1003 08:15:03.280605 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bded89d62a42cb2a0627745807987672ffc99eaeb06b65efeb3c5f209f476bc" Oct 03 08:15:03 crc kubenswrapper[4810]: I1003 08:15:03.638319 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7"] Oct 03 08:15:03 crc kubenswrapper[4810]: I1003 08:15:03.643361 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324610-jqth7"] Oct 03 08:15:05 crc kubenswrapper[4810]: I1003 08:15:05.313672 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ea9ecb-a1be-4648-84c8-4447f16fba29" path="/var/lib/kubelet/pods/d4ea9ecb-a1be-4648-84c8-4447f16fba29/volumes" Oct 03 08:15:14 crc kubenswrapper[4810]: I1003 08:15:14.760760 4810 scope.go:117] "RemoveContainer" containerID="4b0b2387031e5171b1f3fa7f91f2122286ae9047ccad19cffef3595dfe15a459" Oct 03 08:15:32 crc kubenswrapper[4810]: I1003 08:15:32.088857 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:15:32 crc kubenswrapper[4810]: I1003 08:15:32.089335 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.089403 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.090354 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.090416 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.091261 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.091333 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9" gracePeriod=600 Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.791147 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9" exitCode=0 Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.791206 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9"} Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.791416 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc"} Oct 03 08:16:02 crc kubenswrapper[4810]: I1003 08:16:02.791439 4810 scope.go:117] "RemoveContainer" containerID="9acd9d98fd5812320389a2f93aa6a20cad1d77f08b654d3bd3ad5f7a8439ef54" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.129939 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:35 crc kubenswrapper[4810]: E1003 08:17:35.131568 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c14d633-6153-4c93-acaf-e8220d5e601b" containerName="collect-profiles" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.131592 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c14d633-6153-4c93-acaf-e8220d5e601b" containerName="collect-profiles" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.131934 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c14d633-6153-4c93-acaf-e8220d5e601b" containerName="collect-profiles" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.134147 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.148664 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.271958 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgmcp\" (UniqueName: \"kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.272006 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.272024 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.374419 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgmcp\" (UniqueName: \"kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.374994 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.375150 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.375607 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.375735 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.399256 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lgmcp\" (UniqueName: \"kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp\") pod \"redhat-operators-q5hxh\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.459842 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:35 crc kubenswrapper[4810]: I1003 08:17:35.901107 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:36 crc kubenswrapper[4810]: I1003 08:17:36.647978 4810 generic.go:334] "Generic (PLEG): container finished" podID="ff346b80-751c-4008-88ce-ecb67af277cc" containerID="a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982" exitCode=0 Oct 03 08:17:36 crc kubenswrapper[4810]: I1003 08:17:36.648050 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerDied","Data":"a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982"} Oct 03 08:17:36 crc kubenswrapper[4810]: I1003 08:17:36.648327 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerStarted","Data":"1cff1325efa3d64b3d2d4c35ac41e68fd597cf630f9326e8f33456eff34e09b3"} Oct 03 08:17:36 crc kubenswrapper[4810]: I1003 08:17:36.650942 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:17:38 crc kubenswrapper[4810]: I1003 08:17:38.670109 4810 generic.go:334] "Generic (PLEG): container finished" podID="ff346b80-751c-4008-88ce-ecb67af277cc" containerID="b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6" exitCode=0 Oct 03 08:17:38 crc kubenswrapper[4810]: I1003 08:17:38.670228 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerDied","Data":"b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6"} Oct 03 08:17:39 crc kubenswrapper[4810]: I1003 08:17:39.682076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerStarted","Data":"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96"} Oct 03 08:17:39 crc kubenswrapper[4810]: I1003 08:17:39.703594 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q5hxh" podStartSLOduration=2.131573126 podStartE2EDuration="4.703572796s" podCreationTimestamp="2025-10-03 08:17:35 +0000 UTC" firstStartedPulling="2025-10-03 08:17:36.650618923 +0000 UTC m=+4890.077869658" lastFinishedPulling="2025-10-03 08:17:39.222618593 +0000 UTC m=+4892.649869328" observedRunningTime="2025-10-03 08:17:39.703048803 +0000 UTC m=+4893.130299538" watchObservedRunningTime="2025-10-03 08:17:39.703572796 +0000 UTC m=+4893.130823531" Oct 03 08:17:45 crc kubenswrapper[4810]: I1003 08:17:45.460868 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:45 crc kubenswrapper[4810]: I1003 08:17:45.461466 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:45 crc kubenswrapper[4810]: I1003 08:17:45.522991 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:45 crc kubenswrapper[4810]: I1003 08:17:45.775160 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:45 crc kubenswrapper[4810]: I1003 08:17:45.828019 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:47 crc kubenswrapper[4810]: I1003 08:17:47.744938 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q5hxh" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="registry-server" containerID="cri-o://0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96" gracePeriod=2 Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.195117 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.289883 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities\") pod \"ff346b80-751c-4008-88ce-ecb67af277cc\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.290032 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content\") pod \"ff346b80-751c-4008-88ce-ecb67af277cc\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.290096 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgmcp\" (UniqueName: \"kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp\") pod \"ff346b80-751c-4008-88ce-ecb67af277cc\" (UID: \"ff346b80-751c-4008-88ce-ecb67af277cc\") " Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.291310 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities" (OuterVolumeSpecName: "utilities") pod "ff346b80-751c-4008-88ce-ecb67af277cc" (UID: "ff346b80-751c-4008-88ce-ecb67af277cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.299230 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp" (OuterVolumeSpecName: "kube-api-access-lgmcp") pod "ff346b80-751c-4008-88ce-ecb67af277cc" (UID: "ff346b80-751c-4008-88ce-ecb67af277cc"). InnerVolumeSpecName "kube-api-access-lgmcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.391979 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgmcp\" (UniqueName: \"kubernetes.io/projected/ff346b80-751c-4008-88ce-ecb67af277cc-kube-api-access-lgmcp\") on node \"crc\" DevicePath \"\"" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.392334 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.755180 4810 generic.go:334] "Generic (PLEG): container finished" podID="ff346b80-751c-4008-88ce-ecb67af277cc" containerID="0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96" exitCode=0 Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.755243 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerDied","Data":"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96"} Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.755263 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q5hxh" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.755284 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q5hxh" event={"ID":"ff346b80-751c-4008-88ce-ecb67af277cc","Type":"ContainerDied","Data":"1cff1325efa3d64b3d2d4c35ac41e68fd597cf630f9326e8f33456eff34e09b3"} Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.755315 4810 scope.go:117] "RemoveContainer" containerID="0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.784290 4810 scope.go:117] "RemoveContainer" containerID="b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.809565 4810 scope.go:117] "RemoveContainer" containerID="a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.840752 4810 scope.go:117] "RemoveContainer" containerID="0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96" Oct 03 08:17:48 crc kubenswrapper[4810]: E1003 08:17:48.841409 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96\": container with ID starting with 0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96 not found: ID does not exist" containerID="0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.841442 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96"} err="failed to get container status \"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96\": rpc error: code = NotFound desc = could not find container \"0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96\": container with ID starting with 0ad9bace86f6d527fb9410c71c86b0930914bb0ae9d205e9aadeb88094194c96 not found: ID does not exist" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.841464 4810 scope.go:117] 
"RemoveContainer" containerID="b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6" Oct 03 08:17:48 crc kubenswrapper[4810]: E1003 08:17:48.842085 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6\": container with ID starting with b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6 not found: ID does not exist" containerID="b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.842125 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6"} err="failed to get container status \"b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6\": rpc error: code = NotFound desc = could not find container \"b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6\": container with ID starting with b9cee07edb309fdfae79a783422d884c91b8c79f138bfdd56191b9029dee20c6 not found: ID does not exist" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.842147 4810 scope.go:117] "RemoveContainer" containerID="a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982" Oct 03 08:17:48 crc kubenswrapper[4810]: E1003 08:17:48.842880 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982\": container with ID starting with a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982 not found: ID does not exist" containerID="a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982" Oct 03 08:17:48 crc kubenswrapper[4810]: I1003 08:17:48.842943 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982"} err="failed to get container status \"a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982\": rpc error: code = NotFound desc = could not find container \"a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982\": container with ID starting with a73cc1db3560aeba8eefabd0622caf3926e66f483dddf73bc9f58c4ca3147982 not found: ID does not exist" Oct 03 08:17:49 crc kubenswrapper[4810]: I1003 08:17:49.195204 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff346b80-751c-4008-88ce-ecb67af277cc" (UID: "ff346b80-751c-4008-88ce-ecb67af277cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:17:49 crc kubenswrapper[4810]: I1003 08:17:49.203560 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff346b80-751c-4008-88ce-ecb67af277cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:17:49 crc kubenswrapper[4810]: I1003 08:17:49.388064 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:49 crc kubenswrapper[4810]: I1003 08:17:49.394987 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q5hxh"] Oct 03 08:17:51 crc kubenswrapper[4810]: I1003 08:17:51.311304 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" path="/var/lib/kubelet/pods/ff346b80-751c-4008-88ce-ecb67af277cc/volumes" Oct 03 08:18:02 crc kubenswrapper[4810]: I1003 08:18:02.088635 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:18:02 crc kubenswrapper[4810]: I1003 08:18:02.089801 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:18:32 crc kubenswrapper[4810]: I1003 08:18:32.088518 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:18:32 crc kubenswrapper[4810]: I1003 08:18:32.089114 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.088780 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.089539 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.089627 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.090687 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.090811 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" gracePeriod=600 Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.312700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc"} Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.312777 4810 scope.go:117] "RemoveContainer" containerID="9a4b110409b6e4769a2f7ef3362a558c91104a9b13030e5e90efa7e53d112ec9" Oct 03 08:19:02 crc kubenswrapper[4810]: I1003 08:19:02.312663 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" exitCode=0 Oct 03 08:19:02 crc kubenswrapper[4810]: E1003 08:19:02.888069 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:19:03 crc kubenswrapper[4810]: I1003 08:19:03.323541 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:19:03 crc kubenswrapper[4810]: E1003 08:19:03.323994 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:19:14 crc kubenswrapper[4810]: I1003 08:19:14.302724 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:19:14 crc kubenswrapper[4810]: E1003 08:19:14.304469 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:19:25 crc kubenswrapper[4810]: I1003 08:19:25.302096 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:19:25 crc kubenswrapper[4810]: E1003 08:19:25.302790 4810 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:19:39 crc kubenswrapper[4810]: I1003 08:19:39.303378 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:19:39 crc kubenswrapper[4810]: E1003 08:19:39.304400 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:19:53 crc kubenswrapper[4810]: I1003 08:19:53.301982 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:19:53 crc kubenswrapper[4810]: E1003 08:19:53.302768 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:20:05 crc kubenswrapper[4810]: I1003 08:20:05.304210 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:20:05 crc kubenswrapper[4810]: E1003 08:20:05.306840 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:20:19 crc kubenswrapper[4810]: I1003 08:20:19.303501 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:20:19 crc kubenswrapper[4810]: E1003 08:20:19.304291 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:20:30 crc kubenswrapper[4810]: I1003 08:20:30.302327 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:20:30 crc kubenswrapper[4810]: E1003 08:20:30.303207 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:20:45 crc kubenswrapper[4810]: I1003 08:20:45.302076 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:20:45 crc kubenswrapper[4810]: E1003 08:20:45.302948 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:20:56 crc kubenswrapper[4810]: I1003 08:20:56.302835 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:20:56 crc kubenswrapper[4810]: E1003 08:20:56.303590 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:21:07 crc kubenswrapper[4810]: I1003 08:21:07.308655 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:21:07 crc kubenswrapper[4810]: E1003 08:21:07.309616 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:21:22 crc kubenswrapper[4810]: I1003 08:21:22.302404 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:21:22 crc kubenswrapper[4810]: E1003 08:21:22.303131 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.411810 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:32 crc kubenswrapper[4810]: E1003 08:21:32.412688 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="extract-content" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.412702 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="extract-content" Oct 03 08:21:32 crc kubenswrapper[4810]: E1003 08:21:32.412721 4810 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="extract-utilities" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.412730 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="extract-utilities" Oct 03 08:21:32 crc kubenswrapper[4810]: E1003 08:21:32.412739 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="registry-server" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.412744 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="registry-server" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.412915 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff346b80-751c-4008-88ce-ecb67af277cc" containerName="registry-server" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.413949 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.426259 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.572904 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.572947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.572976 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvpq\" (UniqueName: \"kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.673845 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.673909 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.673934 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvpq\" (UniqueName: 
\"kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.674403 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.674461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.692180 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvpq\" (UniqueName: \"kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq\") pod \"certified-operators-t4qmx\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:32 crc kubenswrapper[4810]: I1003 08:21:32.738762 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:33 crc kubenswrapper[4810]: I1003 08:21:33.005742 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:33 crc kubenswrapper[4810]: I1003 08:21:33.562633 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerID="a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce" exitCode=0 Oct 03 08:21:33 crc kubenswrapper[4810]: I1003 08:21:33.562691 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerDied","Data":"a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce"} Oct 03 08:21:33 crc kubenswrapper[4810]: I1003 08:21:33.563006 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerStarted","Data":"e209e89fe7913c4fa7ea6ea8da9a0437bb0026e19d0cd11e0fc0c455839af3e3"} Oct 03 08:21:34 crc kubenswrapper[4810]: I1003 08:21:34.572188 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerID="8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837" exitCode=0 Oct 03 08:21:34 crc kubenswrapper[4810]: I1003 08:21:34.572244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerDied","Data":"8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837"} Oct 03 08:21:35 crc kubenswrapper[4810]: I1003 08:21:35.584776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" 
event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerStarted","Data":"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373"} Oct 03 08:21:35 crc kubenswrapper[4810]: I1003 08:21:35.603825 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4qmx" podStartSLOduration=1.933855791 podStartE2EDuration="3.603806087s" podCreationTimestamp="2025-10-03 08:21:32 +0000 UTC" firstStartedPulling="2025-10-03 08:21:33.565870635 +0000 UTC m=+5126.993121410" lastFinishedPulling="2025-10-03 08:21:35.235820931 +0000 UTC m=+5128.663071706" observedRunningTime="2025-10-03 08:21:35.602601075 +0000 UTC m=+5129.029851850" watchObservedRunningTime="2025-10-03 08:21:35.603806087 +0000 UTC m=+5129.031056842" Oct 03 08:21:37 crc kubenswrapper[4810]: I1003 08:21:37.306340 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:21:37 crc kubenswrapper[4810]: E1003 08:21:37.306780 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:21:42 crc kubenswrapper[4810]: I1003 08:21:42.739992 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:42 crc kubenswrapper[4810]: I1003 08:21:42.740519 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:42 crc kubenswrapper[4810]: I1003 08:21:42.815865 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:43 crc kubenswrapper[4810]: I1003 08:21:43.707460 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:43 crc kubenswrapper[4810]: I1003 08:21:43.751297 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:45 crc kubenswrapper[4810]: I1003 08:21:45.664707 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4qmx" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="registry-server" containerID="cri-o://b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373" gracePeriod=2 Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.574018 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.676130 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerID="b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373" exitCode=0 Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.676187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerDied","Data":"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373"} Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.676212 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4qmx" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.676231 4810 scope.go:117] "RemoveContainer" containerID="b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.676219 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4qmx" event={"ID":"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb","Type":"ContainerDied","Data":"e209e89fe7913c4fa7ea6ea8da9a0437bb0026e19d0cd11e0fc0c455839af3e3"} Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.690462 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content\") pod \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.690536 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdvpq\" (UniqueName: \"kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq\") pod \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.690626 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities\") pod \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\" (UID: \"2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb\") " Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.692156 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities" (OuterVolumeSpecName: "utilities") pod "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" (UID: "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.696692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq" (OuterVolumeSpecName: "kube-api-access-kdvpq") pod "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" (UID: "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb"). InnerVolumeSpecName "kube-api-access-kdvpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.700288 4810 scope.go:117] "RemoveContainer" containerID="8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.740957 4810 scope.go:117] "RemoveContainer" containerID="a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.756397 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" (UID: "2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.764619 4810 scope.go:117] "RemoveContainer" containerID="b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373" Oct 03 08:21:46 crc kubenswrapper[4810]: E1003 08:21:46.765294 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373\": container with ID starting with b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373 not found: ID does not exist" containerID="b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.765368 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373"} err="failed to get container status \"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373\": rpc error: code = NotFound desc = could not find container \"b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373\": container with ID starting with b77ae51b76b6d415dfa8e4c2354ecfeb527011051aa472704e668cc698f2f373 not found: ID does not exist" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.765408 4810 scope.go:117] "RemoveContainer" containerID="8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837" Oct 03 08:21:46 crc kubenswrapper[4810]: E1003 08:21:46.765930 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837\": container with ID starting with 8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837 not found: ID does not exist" containerID="8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.765977 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837"} err="failed to get container status \"8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837\": rpc error: code = NotFound desc = could not find container \"8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837\": container with ID starting with 8a34fa426c95c561dfaffb2c9b5ff967ad6e68b0afb030e25096b2b8949c6837 not found: ID does not exist" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.766006 4810 scope.go:117] "RemoveContainer" containerID="a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce" Oct 03 08:21:46 crc kubenswrapper[4810]: 
E1003 08:21:46.766349 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce\": container with ID starting with a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce not found: ID does not exist" containerID="a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.766478 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce"} err="failed to get container status \"a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce\": rpc error: code = NotFound desc = could not find container \"a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce\": container with ID starting with a1bef6137edfe0ad4e861759b1d746d7045eb94c36a25cfbddcfcc30068786ce not found: ID does not exist" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.792398 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.792452 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:46 crc kubenswrapper[4810]: I1003 08:21:46.792482 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdvpq\" (UniqueName: \"kubernetes.io/projected/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb-kube-api-access-kdvpq\") on node \"crc\" DevicePath \"\"" Oct 03 08:21:47 crc kubenswrapper[4810]: I1003 08:21:47.009534 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:47 crc kubenswrapper[4810]: I1003 08:21:47.014664 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4qmx"] Oct 03 08:21:47 crc kubenswrapper[4810]: I1003 08:21:47.315009 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" path="/var/lib/kubelet/pods/2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb/volumes" Oct 03 08:21:51 crc kubenswrapper[4810]: I1003 08:21:51.302656 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:21:51 crc kubenswrapper[4810]: E1003 08:21:51.303278 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.991815 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:21:59 crc kubenswrapper[4810]: E1003 08:21:59.992567 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="extract-content" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.992579 4810 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="extract-content" Oct 03 08:21:59 crc kubenswrapper[4810]: E1003 08:21:59.992594 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="registry-server" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.992600 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="registry-server" Oct 03 08:21:59 crc kubenswrapper[4810]: E1003 08:21:59.992626 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="extract-utilities" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.992632 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="extract-utilities" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.992782 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8da963-a9ed-4d12-9e2d-6672ae9d6ccb" containerName="registry-server" Oct 03 08:21:59 crc kubenswrapper[4810]: I1003 08:21:59.993748 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.003914 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.176376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzz4\" (UniqueName: \"kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.176749 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.176927 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.277867 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.278023 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.278063 
4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzz4\" (UniqueName: \"kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.278461 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.278563 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.308801 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzz4\" (UniqueName: \"kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4\") pod \"redhat-marketplace-bwq4f\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.320271 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.754965 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:22:00 crc kubenswrapper[4810]: I1003 08:22:00.793473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerStarted","Data":"d242a79f0ba20b64b89138c2b4bc9a9697b52ab7cf41759808868aecc20bdbdb"} Oct 03 08:22:01 crc kubenswrapper[4810]: I1003 08:22:01.804268 4810 generic.go:334] "Generic (PLEG): container finished" podID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerID="d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1" exitCode=0 Oct 03 08:22:01 crc kubenswrapper[4810]: I1003 08:22:01.804367 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerDied","Data":"d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1"} Oct 03 08:22:03 crc kubenswrapper[4810]: I1003 08:22:03.303387 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:22:03 crc kubenswrapper[4810]: E1003 08:22:03.304192 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:22:03 crc kubenswrapper[4810]: I1003 08:22:03.823302 4810 generic.go:334] "Generic (PLEG): container 
finished" podID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerID="f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349" exitCode=0 Oct 03 08:22:03 crc kubenswrapper[4810]: I1003 08:22:03.823343 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerDied","Data":"f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349"} Oct 03 08:22:05 crc kubenswrapper[4810]: I1003 08:22:05.844134 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerStarted","Data":"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a"} Oct 03 08:22:05 crc kubenswrapper[4810]: I1003 08:22:05.867553 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bwq4f" podStartSLOduration=4.011454819 podStartE2EDuration="6.867535566s" podCreationTimestamp="2025-10-03 08:21:59 +0000 UTC" firstStartedPulling="2025-10-03 08:22:01.806680932 +0000 UTC m=+5155.233931677" lastFinishedPulling="2025-10-03 08:22:04.662761679 +0000 UTC m=+5158.090012424" observedRunningTime="2025-10-03 08:22:05.862210435 +0000 UTC m=+5159.289461160" watchObservedRunningTime="2025-10-03 08:22:05.867535566 +0000 UTC m=+5159.294786321" Oct 03 08:22:10 crc kubenswrapper[4810]: I1003 08:22:10.320506 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:10 crc kubenswrapper[4810]: I1003 08:22:10.320927 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:10 crc kubenswrapper[4810]: I1003 08:22:10.361165 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:10 crc kubenswrapper[4810]: I1003 08:22:10.937529 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:11 crc kubenswrapper[4810]: I1003 08:22:11.011678 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:22:12 crc kubenswrapper[4810]: I1003 08:22:12.911336 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bwq4f" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="registry-server" containerID="cri-o://a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a" gracePeriod=2 Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.302862 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.496667 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities\") pod \"414aeb4e-a797-43a5-8af7-a526d90da3ab\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.496721 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzz4\" (UniqueName: \"kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4\") pod \"414aeb4e-a797-43a5-8af7-a526d90da3ab\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.496813 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content\") pod \"414aeb4e-a797-43a5-8af7-a526d90da3ab\" (UID: \"414aeb4e-a797-43a5-8af7-a526d90da3ab\") " Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.499573 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities" (OuterVolumeSpecName: "utilities") pod "414aeb4e-a797-43a5-8af7-a526d90da3ab" (UID: "414aeb4e-a797-43a5-8af7-a526d90da3ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.504455 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4" (OuterVolumeSpecName: "kube-api-access-ztzz4") pod "414aeb4e-a797-43a5-8af7-a526d90da3ab" (UID: "414aeb4e-a797-43a5-8af7-a526d90da3ab"). InnerVolumeSpecName "kube-api-access-ztzz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.511582 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "414aeb4e-a797-43a5-8af7-a526d90da3ab" (UID: "414aeb4e-a797-43a5-8af7-a526d90da3ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.598384 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.598438 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzz4\" (UniqueName: \"kubernetes.io/projected/414aeb4e-a797-43a5-8af7-a526d90da3ab-kube-api-access-ztzz4\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.598460 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/414aeb4e-a797-43a5-8af7-a526d90da3ab-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.921215 4810 generic.go:334] "Generic (PLEG): container finished" podID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerID="a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a" exitCode=0 Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.921270 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerDied","Data":"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a"} Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.921309 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bwq4f" event={"ID":"414aeb4e-a797-43a5-8af7-a526d90da3ab","Type":"ContainerDied","Data":"d242a79f0ba20b64b89138c2b4bc9a9697b52ab7cf41759808868aecc20bdbdb"} Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.921304 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bwq4f" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.921331 4810 scope.go:117] "RemoveContainer" containerID="a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.946370 4810 scope.go:117] "RemoveContainer" containerID="f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349" Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.974450 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.978961 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bwq4f"] Oct 03 08:22:13 crc kubenswrapper[4810]: I1003 08:22:13.980788 4810 scope.go:117] "RemoveContainer" containerID="d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.008983 4810 scope.go:117] "RemoveContainer" containerID="a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a" Oct 03 08:22:14 crc kubenswrapper[4810]: E1003 08:22:14.009456 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a\": container with ID starting with a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a not found: ID does not exist" containerID="a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.009517 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a"} err="failed to get container status \"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a\": rpc error: code = NotFound desc = could not find container \"a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a\": container with ID starting with a644e78a7280727cd5b5c6bae4927248ad91d46df0a8bd92fbfd27c2ec0cf72a not found: ID does not exist" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.009549 4810 scope.go:117] "RemoveContainer" containerID="f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349" Oct 03 08:22:14 crc kubenswrapper[4810]: E1003 08:22:14.010070 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349\": container with ID starting with f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349 not found: ID does not exist" containerID="f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.010311 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349"} err="failed to get container status \"f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349\": rpc error: code = NotFound desc = could not find container \"f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349\": container with ID starting with f7db7da5dcf60db3db4774180bb3d0b7054e4f4ef4866edcc7d08541c5ad6349 not found: ID does not exist" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.010530 4810 scope.go:117] "RemoveContainer" 
containerID="d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1" Oct 03 08:22:14 crc kubenswrapper[4810]: E1003 08:22:14.011591 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1\": container with ID starting with d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1 not found: ID does not exist" containerID="d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1" Oct 03 08:22:14 crc kubenswrapper[4810]: I1003 08:22:14.011625 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1"} err="failed to get container status \"d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1\": rpc error: code = NotFound desc = could not find container \"d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1\": container with ID starting with d18467ba58cfa719d307eddf1c9892221b5a939cd8106617d44dde3ad55f47e1 not found: ID does not exist" Oct 03 08:22:15 crc kubenswrapper[4810]: I1003 08:22:15.312872 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" path="/var/lib/kubelet/pods/414aeb4e-a797-43a5-8af7-a526d90da3ab/volumes" Oct 03 08:22:18 crc kubenswrapper[4810]: I1003 08:22:18.302346 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:22:18 crc kubenswrapper[4810]: E1003 08:22:18.302993 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:22:33 crc kubenswrapper[4810]: I1003 08:22:33.302640 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:22:33 crc kubenswrapper[4810]: E1003 08:22:33.303447 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:22:46 crc kubenswrapper[4810]: I1003 08:22:46.302595 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:22:46 crc kubenswrapper[4810]: E1003 08:22:46.303384 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:01 crc kubenswrapper[4810]: I1003 08:23:01.303482 4810 scope.go:117] "RemoveContainer" 
containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:23:01 crc kubenswrapper[4810]: E1003 08:23:01.304561 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:15 crc kubenswrapper[4810]: I1003 08:23:15.303444 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:23:15 crc kubenswrapper[4810]: E1003 08:23:15.304664 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:26 crc kubenswrapper[4810]: I1003 08:23:26.303796 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:23:26 crc kubenswrapper[4810]: E1003 08:23:26.304980 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:37 crc kubenswrapper[4810]: I1003 08:23:37.311039 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:23:37 crc kubenswrapper[4810]: E1003 08:23:37.312677 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:52 crc kubenswrapper[4810]: I1003 08:23:52.302616 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:23:52 crc kubenswrapper[4810]: E1003 08:23:52.303360 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.189425 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:23:53 crc kubenswrapper[4810]: E1003 08:23:53.189865 4810 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="extract-utilities" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.189884 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="extract-utilities" Oct 03 08:23:53 crc kubenswrapper[4810]: E1003 08:23:53.189928 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="extract-content" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.189940 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="extract-content" Oct 03 08:23:53 crc kubenswrapper[4810]: E1003 08:23:53.189966 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="registry-server" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.189976 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="registry-server" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.190253 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="414aeb4e-a797-43a5-8af7-a526d90da3ab" containerName="registry-server" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.191797 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.213358 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.257747 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jrn\" (UniqueName: \"kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.257813 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.257932 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.359353 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jrn\" (UniqueName: \"kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.359412 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.359500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.360052 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.360294 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.390637 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jrn\" (UniqueName: \"kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn\") pod \"community-operators-l9hp9\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:53 crc kubenswrapper[4810]: I1003 08:23:53.530140 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:23:54 crc kubenswrapper[4810]: I1003 08:23:54.078965 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:23:54 crc kubenswrapper[4810]: I1003 08:23:54.779016 4810 generic.go:334] "Generic (PLEG): container finished" podID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerID="f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0" exitCode=0 Oct 03 08:23:54 crc kubenswrapper[4810]: I1003 08:23:54.779151 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerDied","Data":"f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0"} Oct 03 08:23:54 crc kubenswrapper[4810]: I1003 08:23:54.779205 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerStarted","Data":"71193fe573f4b8a845aa76c81677322029974ae330a5545265599e91e936ca0e"} Oct 03 08:23:54 crc kubenswrapper[4810]: I1003 08:23:54.783145 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:23:55 crc kubenswrapper[4810]: I1003 08:23:55.793940 4810 generic.go:334] "Generic (PLEG): container finished" podID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerID="018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4" exitCode=0 Oct 03 08:23:55 crc kubenswrapper[4810]: I1003 08:23:55.794260 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerDied","Data":"018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4"} Oct 03 08:23:56 crc kubenswrapper[4810]: I1003 08:23:56.816726 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerStarted","Data":"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f"} Oct 03 08:23:56 crc kubenswrapper[4810]: I1003 08:23:56.837547 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l9hp9" podStartSLOduration=2.2460550870000002 podStartE2EDuration="3.837522916s" podCreationTimestamp="2025-10-03 08:23:53 +0000 UTC" firstStartedPulling="2025-10-03 08:23:54.782225114 +0000 UTC m=+5268.209475889" lastFinishedPulling="2025-10-03 08:23:56.373692983 +0000 UTC m=+5269.800943718" observedRunningTime="2025-10-03 08:23:56.836702444 +0000 UTC m=+5270.263953189" watchObservedRunningTime="2025-10-03 08:23:56.837522916 +0000 UTC m=+5270.264773661" Oct 03 08:24:03 crc kubenswrapper[4810]: I1003 08:24:03.530489 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:03 crc kubenswrapper[4810]: I1003 08:24:03.531112 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:03 crc kubenswrapper[4810]: I1003 08:24:03.574197 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:03 crc kubenswrapper[4810]: I1003 08:24:03.926873 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:03 crc kubenswrapper[4810]: I1003 08:24:03.972347 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:24:05 crc kubenswrapper[4810]: I1003 08:24:05.888700 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l9hp9" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="registry-server" containerID="cri-o://b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f" gracePeriod=2 Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.292800 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.303037 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.461461 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities\") pod \"82094d26-cb9d-4098-877c-cc9012e0a2bd\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.462105 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4jrn\" (UniqueName: \"kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn\") pod \"82094d26-cb9d-4098-877c-cc9012e0a2bd\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.462250 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content\") pod \"82094d26-cb9d-4098-877c-cc9012e0a2bd\" (UID: \"82094d26-cb9d-4098-877c-cc9012e0a2bd\") " Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.462718 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities" (OuterVolumeSpecName: "utilities") pod "82094d26-cb9d-4098-877c-cc9012e0a2bd" (UID: "82094d26-cb9d-4098-877c-cc9012e0a2bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.463048 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.469229 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn" (OuterVolumeSpecName: "kube-api-access-d4jrn") pod "82094d26-cb9d-4098-877c-cc9012e0a2bd" (UID: "82094d26-cb9d-4098-877c-cc9012e0a2bd"). InnerVolumeSpecName "kube-api-access-d4jrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.518527 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82094d26-cb9d-4098-877c-cc9012e0a2bd" (UID: "82094d26-cb9d-4098-877c-cc9012e0a2bd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.564334 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4jrn\" (UniqueName: \"kubernetes.io/projected/82094d26-cb9d-4098-877c-cc9012e0a2bd-kube-api-access-d4jrn\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.564365 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82094d26-cb9d-4098-877c-cc9012e0a2bd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.900491 4810 generic.go:334] "Generic (PLEG): container finished" podID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerID="b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f" exitCode=0 Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.900566 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerDied","Data":"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f"} Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.900608 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l9hp9" event={"ID":"82094d26-cb9d-4098-877c-cc9012e0a2bd","Type":"ContainerDied","Data":"71193fe573f4b8a845aa76c81677322029974ae330a5545265599e91e936ca0e"} Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.900632 4810 scope.go:117] "RemoveContainer" containerID="b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.900799 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l9hp9" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.915286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809"} Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.941021 4810 scope.go:117] "RemoveContainer" containerID="018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4" Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.966408 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.977976 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l9hp9"] Oct 03 08:24:06 crc kubenswrapper[4810]: I1003 08:24:06.983051 4810 scope.go:117] "RemoveContainer" containerID="f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.003944 4810 scope.go:117] "RemoveContainer" containerID="b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f" Oct 03 08:24:07 crc kubenswrapper[4810]: E1003 08:24:07.018515 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f\": container with ID starting with b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f not found: ID does not exist" containerID="b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.018602 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f"} err="failed to get container status \"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f\": rpc error: code = NotFound desc = could not find container \"b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f\": container with ID starting with b6b0edf7df827acd53534f667371276f47d63b6666f665e600d9febdcfe04e0f not found: ID does not exist" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.018675 4810 scope.go:117] "RemoveContainer" containerID="018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4" Oct 03 08:24:07 crc kubenswrapper[4810]: E1003 08:24:07.019606 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4\": container with ID starting with 018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4 not found: ID does not exist" containerID="018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.019711 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4"} err="failed to get container status \"018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4\": rpc error: code = NotFound desc = could not find container \"018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4\": container with ID starting with 
018c7e0a74941a2921ee255afc083d53410f949c512c93172fe7104a12c518f4 not found: ID does not exist" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.019773 4810 scope.go:117] "RemoveContainer" containerID="f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0" Oct 03 08:24:07 crc kubenswrapper[4810]: E1003 08:24:07.020206 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0\": container with ID starting with f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0 not found: ID does not exist" containerID="f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.020244 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0"} err="failed to get container status \"f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0\": rpc error: code = NotFound desc = could not find container \"f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0\": container with ID starting with f73267c164ba0c50a5a2f0d4f324b7fe6c03badb49a7d682850fba8a1aeda7b0 not found: ID does not exist" Oct 03 08:24:07 crc kubenswrapper[4810]: I1003 08:24:07.312458 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" path="/var/lib/kubelet/pods/82094d26-cb9d-4098-877c-cc9012e0a2bd/volumes" Oct 03 08:26:32 crc kubenswrapper[4810]: I1003 08:26:32.088573 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:26:32 crc kubenswrapper[4810]: I1003 08:26:32.089364 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:27:02 crc kubenswrapper[4810]: I1003 08:27:02.088626 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:27:02 crc kubenswrapper[4810]: I1003 08:27:02.089395 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.088574 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.089332 4810 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.089403 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.090393 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.090477 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809" gracePeriod=600 Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.584251 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809" exitCode=0 Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.584305 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809"} Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.584991 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b"} Oct 03 08:27:32 crc kubenswrapper[4810]: I1003 08:27:32.585061 4810 scope.go:117] "RemoveContainer" containerID="c23fba9f9ee2a68ffead58b03eb08b527c3bd0c3ca6c372cd306e32207ebfadc" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.891149 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:08 crc kubenswrapper[4810]: E1003 08:28:08.892297 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="extract-content" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.892314 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="extract-content" Oct 03 08:28:08 crc kubenswrapper[4810]: E1003 08:28:08.892326 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="extract-utilities" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.892336 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="extract-utilities" Oct 03 08:28:08 crc kubenswrapper[4810]: E1003 08:28:08.892357 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" 
containerName="registry-server" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.892366 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="registry-server" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.892834 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="82094d26-cb9d-4098-877c-cc9012e0a2bd" containerName="registry-server" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.895225 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:08 crc kubenswrapper[4810]: I1003 08:28:08.931241 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.042279 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2qz6\" (UniqueName: \"kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.042368 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.042497 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.143624 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2qz6\" (UniqueName: \"kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.143730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.143824 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.144186 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " 
pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.144266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.164534 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2qz6\" (UniqueName: \"kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6\") pod \"redhat-operators-lpbth\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.236878 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.672783 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.935455 4810 generic.go:334] "Generic (PLEG): container finished" podID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerID="947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59" exitCode=0 Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.935640 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerDied","Data":"947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59"} Oct 03 08:28:09 crc kubenswrapper[4810]: I1003 08:28:09.935791 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerStarted","Data":"fae4dfb6bfdb8cf342078778d6382853f2e5e9af3a2d667d3bcc8331a89fb15f"} Oct 03 08:28:10 crc kubenswrapper[4810]: I1003 08:28:10.949385 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerStarted","Data":"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30"} Oct 03 08:28:11 crc kubenswrapper[4810]: I1003 08:28:11.961022 4810 generic.go:334] "Generic (PLEG): container finished" podID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerID="5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30" exitCode=0 Oct 03 08:28:11 crc kubenswrapper[4810]: I1003 08:28:11.961071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerDied","Data":"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30"} Oct 03 08:28:12 crc kubenswrapper[4810]: I1003 08:28:12.969523 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerStarted","Data":"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278"} Oct 03 08:28:12 crc kubenswrapper[4810]: I1003 08:28:12.990341 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lpbth" podStartSLOduration=2.52643389 podStartE2EDuration="4.990323298s" 
podCreationTimestamp="2025-10-03 08:28:08 +0000 UTC" firstStartedPulling="2025-10-03 08:28:09.93695554 +0000 UTC m=+5523.364206275" lastFinishedPulling="2025-10-03 08:28:12.400844948 +0000 UTC m=+5525.828095683" observedRunningTime="2025-10-03 08:28:12.986967049 +0000 UTC m=+5526.414217784" watchObservedRunningTime="2025-10-03 08:28:12.990323298 +0000 UTC m=+5526.417574043" Oct 03 08:28:19 crc kubenswrapper[4810]: I1003 08:28:19.237782 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:19 crc kubenswrapper[4810]: I1003 08:28:19.238601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:19 crc kubenswrapper[4810]: I1003 08:28:19.288543 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:20 crc kubenswrapper[4810]: I1003 08:28:20.076338 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:20 crc kubenswrapper[4810]: I1003 08:28:20.125317 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:22 crc kubenswrapper[4810]: I1003 08:28:22.036075 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lpbth" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="registry-server" containerID="cri-o://021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278" gracePeriod=2 Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.007266 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.048940 4810 generic.go:334] "Generic (PLEG): container finished" podID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerID="021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278" exitCode=0 Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.049000 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerDied","Data":"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278"} Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.049030 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lpbth" event={"ID":"0f593c4a-2ba5-4f61-be60-25fe90744fa4","Type":"ContainerDied","Data":"fae4dfb6bfdb8cf342078778d6382853f2e5e9af3a2d667d3bcc8331a89fb15f"} Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.049067 4810 scope.go:117] "RemoveContainer" containerID="021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.049143 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lpbth" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.074143 4810 scope.go:117] "RemoveContainer" containerID="5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.088723 4810 scope.go:117] "RemoveContainer" containerID="947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.113566 4810 scope.go:117] "RemoveContainer" containerID="021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278" Oct 03 08:28:23 crc kubenswrapper[4810]: E1003 08:28:23.114158 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278\": container with ID starting with 021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278 not found: ID does not exist" containerID="021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.114197 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278"} err="failed to get container status \"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278\": rpc error: code = NotFound desc = could not find container \"021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278\": container with ID starting with 021ea4c0dc74e8fb5f6a0eb774979cc45ad8d66d49040c294422ef41e707b278 not found: ID does not exist" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.114224 4810 scope.go:117] "RemoveContainer" containerID="5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30" Oct 03 08:28:23 crc kubenswrapper[4810]: E1003 08:28:23.114680 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30\": container with ID starting with 5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30 not found: ID does not exist" containerID="5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.114701 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30"} err="failed to get container status \"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30\": rpc error: code = NotFound desc = could not find container \"5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30\": container with ID starting with 5649d46499c3c88de02684678020784dda069f2348ae90c7b88bc797a5b7cd30 not found: ID does not exist" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.114714 4810 scope.go:117] "RemoveContainer" containerID="947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59" Oct 03 08:28:23 crc kubenswrapper[4810]: E1003 08:28:23.115052 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59\": container with ID starting with 947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59 not found: ID does not exist" containerID="947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59" 
Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.115095 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59"} err="failed to get container status \"947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59\": rpc error: code = NotFound desc = could not find container \"947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59\": container with ID starting with 947ea6921a1df67cd76e93654c6fb1a362bce2467cd614091d915613cfedeb59 not found: ID does not exist" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.150545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities\") pod \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.150663 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content\") pod \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.150952 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2qz6\" (UniqueName: \"kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6\") pod \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\" (UID: \"0f593c4a-2ba5-4f61-be60-25fe90744fa4\") " Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.152915 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities" (OuterVolumeSpecName: "utilities") pod "0f593c4a-2ba5-4f61-be60-25fe90744fa4" (UID: "0f593c4a-2ba5-4f61-be60-25fe90744fa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.157223 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6" (OuterVolumeSpecName: "kube-api-access-m2qz6") pod "0f593c4a-2ba5-4f61-be60-25fe90744fa4" (UID: "0f593c4a-2ba5-4f61-be60-25fe90744fa4"). InnerVolumeSpecName "kube-api-access-m2qz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.240230 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f593c4a-2ba5-4f61-be60-25fe90744fa4" (UID: "0f593c4a-2ba5-4f61-be60-25fe90744fa4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.252492 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.252525 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f593c4a-2ba5-4f61-be60-25fe90744fa4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.252538 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2qz6\" (UniqueName: \"kubernetes.io/projected/0f593c4a-2ba5-4f61-be60-25fe90744fa4-kube-api-access-m2qz6\") on node \"crc\" DevicePath \"\"" Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.375745 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:23 crc kubenswrapper[4810]: I1003 08:28:23.382360 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lpbth"] Oct 03 08:28:25 crc kubenswrapper[4810]: I1003 08:28:25.315000 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" path="/var/lib/kubelet/pods/0f593c4a-2ba5-4f61-be60-25fe90744fa4/volumes" Oct 03 08:29:32 crc kubenswrapper[4810]: I1003 08:29:32.088417 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:29:32 crc kubenswrapper[4810]: I1003 08:29:32.089324 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.152415 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4"] Oct 03 08:30:00 crc kubenswrapper[4810]: E1003 08:30:00.153445 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="extract-utilities" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.153464 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="extract-utilities" Oct 03 08:30:00 crc kubenswrapper[4810]: E1003 08:30:00.153484 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="extract-content" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.153491 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="extract-content" Oct 03 08:30:00 crc kubenswrapper[4810]: E1003 08:30:00.153516 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="registry-server" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.153522 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="registry-server" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.153699 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f593c4a-2ba5-4f61-be60-25fe90744fa4" containerName="registry-server" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.154342 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.156991 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.157072 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.161872 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4"] Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.246480 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.246948 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.247088 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdsh2\" (UniqueName: \"kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.348630 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.349255 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.349372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdsh2\" (UniqueName: \"kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2\") pod 
\"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.349752 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.357875 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.368557 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdsh2\" (UniqueName: \"kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2\") pod \"collect-profiles-29324670-jfdf4\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.479281 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:00 crc kubenswrapper[4810]: I1003 08:30:00.910382 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4"] Oct 03 08:30:01 crc kubenswrapper[4810]: I1003 08:30:01.867619 4810 generic.go:334] "Generic (PLEG): container finished" podID="9e716951-1a48-42f1-b790-e047fba7b477" containerID="d9f9bc46f0e813c021c190f7eba0554587ee3170bfe9298bcf82420f3c2a5429" exitCode=0 Oct 03 08:30:01 crc kubenswrapper[4810]: I1003 08:30:01.867735 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" event={"ID":"9e716951-1a48-42f1-b790-e047fba7b477","Type":"ContainerDied","Data":"d9f9bc46f0e813c021c190f7eba0554587ee3170bfe9298bcf82420f3c2a5429"} Oct 03 08:30:01 crc kubenswrapper[4810]: I1003 08:30:01.868064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" event={"ID":"9e716951-1a48-42f1-b790-e047fba7b477","Type":"ContainerStarted","Data":"a2fee8758f2ed0813aba4fc7698c91fc2026c7c7ecabab8e86a58430987e3939"} Oct 03 08:30:02 crc kubenswrapper[4810]: I1003 08:30:02.088785 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:30:02 crc kubenswrapper[4810]: I1003 08:30:02.088859 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 
08:30:03.167111 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.331529 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume\") pod \"9e716951-1a48-42f1-b790-e047fba7b477\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.331833 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdsh2\" (UniqueName: \"kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2\") pod \"9e716951-1a48-42f1-b790-e047fba7b477\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.331924 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume\") pod \"9e716951-1a48-42f1-b790-e047fba7b477\" (UID: \"9e716951-1a48-42f1-b790-e047fba7b477\") " Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.332595 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e716951-1a48-42f1-b790-e047fba7b477" (UID: "9e716951-1a48-42f1-b790-e047fba7b477"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.337887 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e716951-1a48-42f1-b790-e047fba7b477" (UID: "9e716951-1a48-42f1-b790-e047fba7b477"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.342104 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2" (OuterVolumeSpecName: "kube-api-access-pdsh2") pod "9e716951-1a48-42f1-b790-e047fba7b477" (UID: "9e716951-1a48-42f1-b790-e047fba7b477"). InnerVolumeSpecName "kube-api-access-pdsh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.434513 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e716951-1a48-42f1-b790-e047fba7b477-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.434555 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdsh2\" (UniqueName: \"kubernetes.io/projected/9e716951-1a48-42f1-b790-e047fba7b477-kube-api-access-pdsh2\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.434567 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e716951-1a48-42f1-b790-e047fba7b477-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.884965 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" event={"ID":"9e716951-1a48-42f1-b790-e047fba7b477","Type":"ContainerDied","Data":"a2fee8758f2ed0813aba4fc7698c91fc2026c7c7ecabab8e86a58430987e3939"} Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.885295 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2fee8758f2ed0813aba4fc7698c91fc2026c7c7ecabab8e86a58430987e3939" Oct 03 08:30:03 crc kubenswrapper[4810]: I1003 08:30:03.885032 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4" Oct 03 08:30:04 crc kubenswrapper[4810]: I1003 08:30:04.268219 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt"] Oct 03 08:30:04 crc kubenswrapper[4810]: I1003 08:30:04.273953 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324625-xc6zt"] Oct 03 08:30:05 crc kubenswrapper[4810]: I1003 08:30:05.322363 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68543805-1fb5-4859-8c83-85b363604567" path="/var/lib/kubelet/pods/68543805-1fb5-4859-8c83-85b363604567/volumes" Oct 03 08:30:15 crc kubenswrapper[4810]: I1003 08:30:15.079219 4810 scope.go:117] "RemoveContainer" containerID="2401285a921ee290e4e7e9187c697863dedfaa15d14d71ca8f9b427918dbde77" Oct 03 08:30:32 crc kubenswrapper[4810]: I1003 08:30:32.088323 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:30:32 crc kubenswrapper[4810]: I1003 08:30:32.088737 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:30:32 crc kubenswrapper[4810]: I1003 08:30:32.088781 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:30:32 crc kubenswrapper[4810]: I1003 08:30:32.089442 4810 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:30:32 crc kubenswrapper[4810]: I1003 08:30:32.089488 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" gracePeriod=600 Oct 03 08:30:32 crc kubenswrapper[4810]: E1003 08:30:32.212742 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:30:33 crc kubenswrapper[4810]: I1003 08:30:33.139589 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" exitCode=0 Oct 03 08:30:33 crc kubenswrapper[4810]: I1003 08:30:33.139643 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b"} Oct 03 08:30:33 crc kubenswrapper[4810]: I1003 08:30:33.139699 4810 scope.go:117] "RemoveContainer" containerID="9bd4c3312f741a85d634bfcd94374989097bb36e469fb559f08cdf031a195809" Oct 03 08:30:33 crc kubenswrapper[4810]: I1003 08:30:33.140346 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:30:33 crc kubenswrapper[4810]: E1003 08:30:33.140665 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:30:46 crc kubenswrapper[4810]: I1003 08:30:46.302680 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:30:46 crc kubenswrapper[4810]: E1003 08:30:46.303540 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:31:00 crc kubenswrapper[4810]: I1003 08:31:00.302474 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:31:00 crc kubenswrapper[4810]: E1003 08:31:00.306888 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:31:15 crc kubenswrapper[4810]: I1003 08:31:15.303140 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:31:15 crc kubenswrapper[4810]: E1003 08:31:15.304084 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:31:30 crc kubenswrapper[4810]: I1003 08:31:30.302516 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:31:30 crc kubenswrapper[4810]: E1003 08:31:30.304341 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:31:42 crc kubenswrapper[4810]: I1003 08:31:42.302328 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:31:42 crc kubenswrapper[4810]: E1003 08:31:42.303976 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:31:57 crc kubenswrapper[4810]: I1003 08:31:57.307160 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:31:57 crc kubenswrapper[4810]: E1003 08:31:57.308083 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:32:09 crc kubenswrapper[4810]: I1003 08:32:09.303170 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:32:09 crc kubenswrapper[4810]: E1003 08:32:09.304075 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:32:21 crc kubenswrapper[4810]: I1003 08:32:21.302974 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:32:21 crc kubenswrapper[4810]: E1003 08:32:21.303849 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:32:34 crc kubenswrapper[4810]: I1003 08:32:34.303308 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:32:34 crc kubenswrapper[4810]: E1003 08:32:34.304485 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:32:48 crc kubenswrapper[4810]: I1003 08:32:48.303237 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:32:48 crc kubenswrapper[4810]: E1003 08:32:48.305269 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:33:01 crc kubenswrapper[4810]: I1003 08:33:01.303184 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:33:01 crc kubenswrapper[4810]: E1003 08:33:01.304312 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:33:14 crc kubenswrapper[4810]: I1003 08:33:14.304067 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:33:14 crc kubenswrapper[4810]: E1003 08:33:14.305080 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:33:16 crc kubenswrapper[4810]: I1003 08:33:16.881691 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:16 crc kubenswrapper[4810]: E1003 08:33:16.882023 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e716951-1a48-42f1-b790-e047fba7b477" containerName="collect-profiles" Oct 03 08:33:16 crc kubenswrapper[4810]: I1003 08:33:16.882035 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e716951-1a48-42f1-b790-e047fba7b477" containerName="collect-profiles" Oct 03 08:33:16 crc kubenswrapper[4810]: I1003 08:33:16.882167 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e716951-1a48-42f1-b790-e047fba7b477" containerName="collect-profiles" Oct 03 08:33:16 crc kubenswrapper[4810]: I1003 08:33:16.883775 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:16 crc kubenswrapper[4810]: I1003 08:33:16.902951 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.040363 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.040554 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsmqk\" (UniqueName: \"kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.040647 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.141759 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.141867 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsmqk\" (UniqueName: \"kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.141921 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities\") pod 
\"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.142397 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.142752 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.166940 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsmqk\" (UniqueName: \"kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk\") pod \"redhat-marketplace-pwrhq\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.213541 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:17 crc kubenswrapper[4810]: I1003 08:33:17.640520 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:18 crc kubenswrapper[4810]: I1003 08:33:18.528406 4810 generic.go:334] "Generic (PLEG): container finished" podID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerID="6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c" exitCode=0 Oct 03 08:33:18 crc kubenswrapper[4810]: I1003 08:33:18.528454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerDied","Data":"6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c"} Oct 03 08:33:18 crc kubenswrapper[4810]: I1003 08:33:18.528491 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerStarted","Data":"c0a180233ab2dcabdaf7dda2b4af04ad7dd597f09b9da231b335894e7583ec2e"} Oct 03 08:33:18 crc kubenswrapper[4810]: I1003 08:33:18.531289 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:33:20 crc kubenswrapper[4810]: I1003 08:33:20.547307 4810 generic.go:334] "Generic (PLEG): container finished" podID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerID="d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a" exitCode=0 Oct 03 08:33:20 crc kubenswrapper[4810]: I1003 08:33:20.547396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerDied","Data":"d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a"} Oct 03 08:33:21 crc kubenswrapper[4810]: I1003 08:33:21.561125 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" 
event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerStarted","Data":"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb"} Oct 03 08:33:21 crc kubenswrapper[4810]: I1003 08:33:21.588005 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pwrhq" podStartSLOduration=3.150927753 podStartE2EDuration="5.587828653s" podCreationTimestamp="2025-10-03 08:33:16 +0000 UTC" firstStartedPulling="2025-10-03 08:33:18.531033665 +0000 UTC m=+5831.958284410" lastFinishedPulling="2025-10-03 08:33:20.967934565 +0000 UTC m=+5834.395185310" observedRunningTime="2025-10-03 08:33:21.579609355 +0000 UTC m=+5835.006860170" watchObservedRunningTime="2025-10-03 08:33:21.587828653 +0000 UTC m=+5835.015079428" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.214496 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.215128 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.258114 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.306619 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:33:27 crc kubenswrapper[4810]: E1003 08:33:27.306885 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.657794 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:27 crc kubenswrapper[4810]: I1003 08:33:27.708824 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:29 crc kubenswrapper[4810]: I1003 08:33:29.629931 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pwrhq" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="registry-server" containerID="cri-o://2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb" gracePeriod=2 Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.521277 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.640328 4810 generic.go:334] "Generic (PLEG): container finished" podID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerID="2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb" exitCode=0 Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.640385 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerDied","Data":"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb"} Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.640400 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwrhq" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.640424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwrhq" event={"ID":"393b9a68-0d6f-4373-9ef9-ef1407d71ff4","Type":"ContainerDied","Data":"c0a180233ab2dcabdaf7dda2b4af04ad7dd597f09b9da231b335894e7583ec2e"} Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.640444 4810 scope.go:117] "RemoveContainer" containerID="2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.654712 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content\") pod \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.654843 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsmqk\" (UniqueName: \"kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk\") pod \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.655042 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities\") pod \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\" (UID: \"393b9a68-0d6f-4373-9ef9-ef1407d71ff4\") " Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.656252 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities" (OuterVolumeSpecName: "utilities") pod "393b9a68-0d6f-4373-9ef9-ef1407d71ff4" (UID: "393b9a68-0d6f-4373-9ef9-ef1407d71ff4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.658394 4810 scope.go:117] "RemoveContainer" containerID="d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.659961 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk" (OuterVolumeSpecName: "kube-api-access-qsmqk") pod "393b9a68-0d6f-4373-9ef9-ef1407d71ff4" (UID: "393b9a68-0d6f-4373-9ef9-ef1407d71ff4"). InnerVolumeSpecName "kube-api-access-qsmqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.671827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "393b9a68-0d6f-4373-9ef9-ef1407d71ff4" (UID: "393b9a68-0d6f-4373-9ef9-ef1407d71ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.698346 4810 scope.go:117] "RemoveContainer" containerID="6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.730771 4810 scope.go:117] "RemoveContainer" containerID="2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb" Oct 03 08:33:30 crc kubenswrapper[4810]: E1003 08:33:30.731406 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb\": container with ID starting with 2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb not found: ID does not exist" containerID="2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.731462 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb"} err="failed to get container status \"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb\": rpc error: code = NotFound desc = could not find container \"2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb\": container with ID starting with 2620d60115353a18876ac5757ab364839c71c945e43040c170426b401531e9bb not found: ID does not exist" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.731498 4810 scope.go:117] "RemoveContainer" containerID="d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a" Oct 03 08:33:30 crc kubenswrapper[4810]: E1003 08:33:30.732970 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a\": container with ID starting with d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a not found: ID does not exist" containerID="d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.733027 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a"} err="failed to get container status \"d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a\": rpc error: code = NotFound desc = could not find container \"d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a\": container with ID starting with d6c8c55696d8a72eae2ad5a1d03705e47be9c87aebb8155dcdf2994ebbfa7d1a not found: ID does not exist" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.733060 4810 scope.go:117] "RemoveContainer" containerID="6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c" Oct 03 08:33:30 crc kubenswrapper[4810]: E1003 08:33:30.733707 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c\": container with ID starting with 6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c not found: ID does not exist" containerID="6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.733775 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c"} err="failed to get container status \"6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c\": rpc error: code = NotFound desc = could not find container \"6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c\": container with ID starting with 6dcd92da19267f2741ae3ec50f3b00f26c3ad317dd1789551f4e02801b4f3d3c not found: ID does not exist" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.756591 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.756626 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsmqk\" (UniqueName: \"kubernetes.io/projected/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-kube-api-access-qsmqk\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.756639 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/393b9a68-0d6f-4373-9ef9-ef1407d71ff4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.989202 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:30 crc kubenswrapper[4810]: I1003 08:33:30.994546 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwrhq"] Oct 03 08:33:31 crc kubenswrapper[4810]: I1003 08:33:31.311481 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" path="/var/lib/kubelet/pods/393b9a68-0d6f-4373-9ef9-ef1407d71ff4/volumes" Oct 03 08:33:39 crc kubenswrapper[4810]: I1003 08:33:39.303359 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:33:39 crc kubenswrapper[4810]: E1003 08:33:39.304345 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:33:54 crc kubenswrapper[4810]: I1003 08:33:54.302871 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:33:54 crc kubenswrapper[4810]: E1003 08:33:54.305239 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:34:07 crc kubenswrapper[4810]: I1003 08:34:07.315487 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:34:07 crc kubenswrapper[4810]: E1003 08:34:07.316721 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:34:22 crc kubenswrapper[4810]: I1003 08:34:22.302259 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:34:22 crc kubenswrapper[4810]: E1003 08:34:22.303183 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:34:33 crc kubenswrapper[4810]: I1003 08:34:33.302209 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:34:33 crc kubenswrapper[4810]: E1003 08:34:33.303146 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.940341 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:34 crc kubenswrapper[4810]: E1003 08:34:34.940851 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="registry-server" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.940868 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="registry-server" Oct 03 08:34:34 crc kubenswrapper[4810]: E1003 08:34:34.940926 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="extract-content" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.940934 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="extract-content" Oct 03 08:34:34 crc kubenswrapper[4810]: E1003 08:34:34.940949 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="extract-utilities" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.940956 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="extract-utilities" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 
08:34:34.941120 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="393b9a68-0d6f-4373-9ef9-ef1407d71ff4" containerName="registry-server" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.942387 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:34 crc kubenswrapper[4810]: I1003 08:34:34.955810 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.138428 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.138547 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f99fp\" (UniqueName: \"kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.138614 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.240218 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.240350 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f99fp\" (UniqueName: \"kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.240401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.240950 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.241129 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.266548 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f99fp\" (UniqueName: \"kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp\") pod \"community-operators-zt4xq\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:35 crc kubenswrapper[4810]: I1003 08:34:35.564647 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:36 crc kubenswrapper[4810]: I1003 08:34:36.038182 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:36 crc kubenswrapper[4810]: I1003 08:34:36.197159 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerStarted","Data":"30b143179bb1f372deafc18d236aa6f4baea500bb712a29033a2e43c2be345f0"} Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.208276 4810 generic.go:334] "Generic (PLEG): container finished" podID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerID="9e01e58d9fb1d99e2d6b062e22ac372efe423492175eb3f0abb73a64d6c537b3" exitCode=0 Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.209693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerDied","Data":"9e01e58d9fb1d99e2d6b062e22ac372efe423492175eb3f0abb73a64d6c537b3"} Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.339681 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.344085 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.350125 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.387003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.387135 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wkng\" (UniqueName: \"kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.388197 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.488924 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.489020 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.489073 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wkng\" (UniqueName: \"kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.489607 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.489743 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.520074 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7wkng\" (UniqueName: \"kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng\") pod \"certified-operators-wmb4p\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.675498 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:37 crc kubenswrapper[4810]: I1003 08:34:37.974514 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:38 crc kubenswrapper[4810]: I1003 08:34:38.219446 4810 generic.go:334] "Generic (PLEG): container finished" podID="20416976-97df-4e15-a23f-2b83e537c3d1" containerID="90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2" exitCode=0 Oct 03 08:34:38 crc kubenswrapper[4810]: I1003 08:34:38.219493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerDied","Data":"90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2"} Oct 03 08:34:38 crc kubenswrapper[4810]: I1003 08:34:38.220818 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerStarted","Data":"124f1aa5357448b0e6cf03041f2457ad51d9e9b1b8f9d2c04edc1a4555b921f8"} Oct 03 08:34:38 crc kubenswrapper[4810]: I1003 08:34:38.223854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerStarted","Data":"9efc9aee58c5edce44c3c8bd2d7df2cd60892883373911b2c60d5808c644051a"} Oct 03 08:34:39 crc kubenswrapper[4810]: I1003 08:34:39.231807 4810 generic.go:334] "Generic (PLEG): container finished" podID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerID="9efc9aee58c5edce44c3c8bd2d7df2cd60892883373911b2c60d5808c644051a" exitCode=0 Oct 03 08:34:39 crc kubenswrapper[4810]: I1003 08:34:39.231885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerDied","Data":"9efc9aee58c5edce44c3c8bd2d7df2cd60892883373911b2c60d5808c644051a"} Oct 03 08:34:39 crc kubenswrapper[4810]: I1003 08:34:39.234642 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerStarted","Data":"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03"} Oct 03 08:34:40 crc kubenswrapper[4810]: I1003 08:34:40.245486 4810 generic.go:334] "Generic (PLEG): container finished" podID="20416976-97df-4e15-a23f-2b83e537c3d1" containerID="a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03" exitCode=0 Oct 03 08:34:40 crc kubenswrapper[4810]: I1003 08:34:40.245604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerDied","Data":"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03"} Oct 03 08:34:40 crc kubenswrapper[4810]: I1003 08:34:40.250945 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" 
event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerStarted","Data":"53796507c9d020c25fddd6989f8a0387331b050eb653f256d6980512136d1490"} Oct 03 08:34:40 crc kubenswrapper[4810]: I1003 08:34:40.289980 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zt4xq" podStartSLOduration=3.813074138 podStartE2EDuration="6.289963661s" podCreationTimestamp="2025-10-03 08:34:34 +0000 UTC" firstStartedPulling="2025-10-03 08:34:37.210836549 +0000 UTC m=+5910.638087304" lastFinishedPulling="2025-10-03 08:34:39.687726092 +0000 UTC m=+5913.114976827" observedRunningTime="2025-10-03 08:34:40.285321408 +0000 UTC m=+5913.712572143" watchObservedRunningTime="2025-10-03 08:34:40.289963661 +0000 UTC m=+5913.717214396" Oct 03 08:34:41 crc kubenswrapper[4810]: I1003 08:34:41.260808 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerStarted","Data":"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343"} Oct 03 08:34:41 crc kubenswrapper[4810]: I1003 08:34:41.282081 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wmb4p" podStartSLOduration=1.622645437 podStartE2EDuration="4.282063739s" podCreationTimestamp="2025-10-03 08:34:37 +0000 UTC" firstStartedPulling="2025-10-03 08:34:38.221428077 +0000 UTC m=+5911.648678812" lastFinishedPulling="2025-10-03 08:34:40.880846379 +0000 UTC m=+5914.308097114" observedRunningTime="2025-10-03 08:34:41.280450186 +0000 UTC m=+5914.707700921" watchObservedRunningTime="2025-10-03 08:34:41.282063739 +0000 UTC m=+5914.709314474" Oct 03 08:34:45 crc kubenswrapper[4810]: I1003 08:34:45.565542 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:45 crc kubenswrapper[4810]: I1003 08:34:45.565956 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:45 crc kubenswrapper[4810]: I1003 08:34:45.625008 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:46 crc kubenswrapper[4810]: I1003 08:34:46.339438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:46 crc kubenswrapper[4810]: I1003 08:34:46.383152 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:47 crc kubenswrapper[4810]: I1003 08:34:47.676112 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:47 crc kubenswrapper[4810]: I1003 08:34:47.676447 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:47 crc kubenswrapper[4810]: I1003 08:34:47.717601 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:48 crc kubenswrapper[4810]: I1003 08:34:48.302141 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:34:48 crc kubenswrapper[4810]: E1003 08:34:48.302465 4810 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:34:48 crc kubenswrapper[4810]: I1003 08:34:48.312391 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zt4xq" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="registry-server" containerID="cri-o://53796507c9d020c25fddd6989f8a0387331b050eb653f256d6980512136d1490" gracePeriod=2 Oct 03 08:34:48 crc kubenswrapper[4810]: I1003 08:34:48.356967 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:49 crc kubenswrapper[4810]: I1003 08:34:49.264534 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:49 crc kubenswrapper[4810]: I1003 08:34:49.320809 4810 generic.go:334] "Generic (PLEG): container finished" podID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerID="53796507c9d020c25fddd6989f8a0387331b050eb653f256d6980512136d1490" exitCode=0 Oct 03 08:34:49 crc kubenswrapper[4810]: I1003 08:34:49.320930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerDied","Data":"53796507c9d020c25fddd6989f8a0387331b050eb653f256d6980512136d1490"} Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.273981 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.366332 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wmb4p" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="registry-server" containerID="cri-o://8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343" gracePeriod=2 Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.366738 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zt4xq" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.367098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zt4xq" event={"ID":"64e54065-754e-42f0-ab1a-4ab4ad78b0b5","Type":"ContainerDied","Data":"30b143179bb1f372deafc18d236aa6f4baea500bb712a29033a2e43c2be345f0"} Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.367143 4810 scope.go:117] "RemoveContainer" containerID="53796507c9d020c25fddd6989f8a0387331b050eb653f256d6980512136d1490" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.389041 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities\") pod \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.389153 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f99fp\" (UniqueName: \"kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp\") pod \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.389193 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content\") pod \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\" (UID: \"64e54065-754e-42f0-ab1a-4ab4ad78b0b5\") " Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.392111 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities" (OuterVolumeSpecName: "utilities") pod "64e54065-754e-42f0-ab1a-4ab4ad78b0b5" (UID: "64e54065-754e-42f0-ab1a-4ab4ad78b0b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.417099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp" (OuterVolumeSpecName: "kube-api-access-f99fp") pod "64e54065-754e-42f0-ab1a-4ab4ad78b0b5" (UID: "64e54065-754e-42f0-ab1a-4ab4ad78b0b5"). InnerVolumeSpecName "kube-api-access-f99fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.422533 4810 scope.go:117] "RemoveContainer" containerID="9efc9aee58c5edce44c3c8bd2d7df2cd60892883373911b2c60d5808c644051a" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.446409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64e54065-754e-42f0-ab1a-4ab4ad78b0b5" (UID: "64e54065-754e-42f0-ab1a-4ab4ad78b0b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.455212 4810 scope.go:117] "RemoveContainer" containerID="9e01e58d9fb1d99e2d6b062e22ac372efe423492175eb3f0abb73a64d6c537b3" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.490702 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f99fp\" (UniqueName: \"kubernetes.io/projected/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-kube-api-access-f99fp\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.490733 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.490742 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64e54065-754e-42f0-ab1a-4ab4ad78b0b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.697460 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:50 crc kubenswrapper[4810]: I1003 08:34:50.702041 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zt4xq"] Oct 03 08:34:51 crc kubenswrapper[4810]: I1003 08:34:51.314787 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" path="/var/lib/kubelet/pods/64e54065-754e-42f0-ab1a-4ab4ad78b0b5/volumes" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:51.913989 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.112959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wkng\" (UniqueName: \"kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng\") pod \"20416976-97df-4e15-a23f-2b83e537c3d1\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.114293 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content\") pod \"20416976-97df-4e15-a23f-2b83e537c3d1\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.114369 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities\") pod \"20416976-97df-4e15-a23f-2b83e537c3d1\" (UID: \"20416976-97df-4e15-a23f-2b83e537c3d1\") " Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.115630 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities" (OuterVolumeSpecName: "utilities") pod "20416976-97df-4e15-a23f-2b83e537c3d1" (UID: "20416976-97df-4e15-a23f-2b83e537c3d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.120163 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng" (OuterVolumeSpecName: "kube-api-access-7wkng") pod "20416976-97df-4e15-a23f-2b83e537c3d1" (UID: "20416976-97df-4e15-a23f-2b83e537c3d1"). InnerVolumeSpecName "kube-api-access-7wkng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.181928 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20416976-97df-4e15-a23f-2b83e537c3d1" (UID: "20416976-97df-4e15-a23f-2b83e537c3d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.216335 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wkng\" (UniqueName: \"kubernetes.io/projected/20416976-97df-4e15-a23f-2b83e537c3d1-kube-api-access-7wkng\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.216423 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.216445 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20416976-97df-4e15-a23f-2b83e537c3d1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.384361 4810 generic.go:334] "Generic (PLEG): container finished" podID="20416976-97df-4e15-a23f-2b83e537c3d1" containerID="8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343" exitCode=0 Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.384409 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerDied","Data":"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343"} Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.384467 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmb4p" event={"ID":"20416976-97df-4e15-a23f-2b83e537c3d1","Type":"ContainerDied","Data":"124f1aa5357448b0e6cf03041f2457ad51d9e9b1b8f9d2c04edc1a4555b921f8"} Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.384487 4810 scope.go:117] "RemoveContainer" containerID="8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.384484 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmb4p" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.410358 4810 scope.go:117] "RemoveContainer" containerID="a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.422750 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.429080 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wmb4p"] Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.447927 4810 scope.go:117] "RemoveContainer" containerID="90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.471335 4810 scope.go:117] "RemoveContainer" containerID="8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343" Oct 03 08:34:52 crc kubenswrapper[4810]: E1003 08:34:52.472197 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343\": container with ID starting with 8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343 not found: ID does not exist" containerID="8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.472238 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343"} err="failed to get container status \"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343\": rpc error: code = NotFound desc = could not find container \"8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343\": container with ID starting with 8b8cffcc9d36b9d842c3d0d3b0effd8e4105c4c4d6d9eb53b6ead5a32aac4343 not found: ID does not exist" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.472264 4810 scope.go:117] "RemoveContainer" containerID="a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03" Oct 03 08:34:52 crc kubenswrapper[4810]: E1003 08:34:52.472549 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03\": container with ID starting with a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03 not found: ID does not exist" containerID="a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.472581 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03"} err="failed to get container status \"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03\": rpc error: code = NotFound desc = could not find container \"a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03\": container with ID starting with a9725ffea11c61126771cb32847d9c26613758a392bd753318e0665731f77e03 not found: ID does not exist" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.472604 4810 scope.go:117] "RemoveContainer" containerID="90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2" Oct 03 08:34:52 crc kubenswrapper[4810]: E1003 08:34:52.473115 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2\": container with ID starting with 90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2 not found: ID does not exist" containerID="90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2" Oct 03 08:34:52 crc kubenswrapper[4810]: I1003 08:34:52.473133 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2"} err="failed to get container status \"90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2\": rpc error: code = NotFound desc = could not find container \"90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2\": container with ID starting with 90b24d98352179598b4eb062db412447c5adb05a7f03ace5719b8489f0e32fe2 not found: ID does not exist" Oct 03 08:34:53 crc kubenswrapper[4810]: I1003 08:34:53.312228 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" path="/var/lib/kubelet/pods/20416976-97df-4e15-a23f-2b83e537c3d1/volumes" Oct 03 08:35:03 crc kubenswrapper[4810]: I1003 08:35:03.302492 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:35:03 crc kubenswrapper[4810]: E1003 08:35:03.303326 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:35:17 crc kubenswrapper[4810]: I1003 08:35:17.307450 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:35:17 crc kubenswrapper[4810]: E1003 08:35:17.308438 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:35:29 crc kubenswrapper[4810]: I1003 08:35:29.302409 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:35:29 crc kubenswrapper[4810]: E1003 08:35:29.303305 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:35:40 crc kubenswrapper[4810]: I1003 08:35:40.302435 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:35:40 crc kubenswrapper[4810]: I1003 08:35:40.780816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326"} Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.817811 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-h875q"] Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.825285 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-h875q"] Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.955535 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-gmxwn"] Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956143 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="extract-utilities" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956228 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="extract-utilities" Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956299 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="extract-utilities" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956361 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="extract-utilities" Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956419 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="extract-content" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956471 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="extract-content" Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956528 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956575 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956631 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956676 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: E1003 08:36:12.956725 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="extract-content" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.956804 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="extract-content" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.957010 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="20416976-97df-4e15-a23f-2b83e537c3d1" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.957090 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e54065-754e-42f0-ab1a-4ab4ad78b0b5" containerName="registry-server" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.957619 
4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.959345 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.959589 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.962423 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.962423 4810 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-fk9ct" Oct 03 08:36:12 crc kubenswrapper[4810]: I1003 08:36:12.973034 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gmxwn"] Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.025384 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.025429 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.025467 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9xm\" (UniqueName: \"kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.126572 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.126622 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.126648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9xm\" (UniqueName: \"kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.126948 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt\") pod \"crc-storage-crc-gmxwn\" (UID: 
\"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.127571 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.145189 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9xm\" (UniqueName: \"kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm\") pod \"crc-storage-crc-gmxwn\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.275191 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.316968 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6897b018-d1be-4256-9e81-6f8836a0e303" path="/var/lib/kubelet/pods/6897b018-d1be-4256-9e81-6f8836a0e303/volumes" Oct 03 08:36:13 crc kubenswrapper[4810]: I1003 08:36:13.698525 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-gmxwn"] Oct 03 08:36:14 crc kubenswrapper[4810]: I1003 08:36:14.036476 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gmxwn" event={"ID":"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22","Type":"ContainerStarted","Data":"ac2f9cfb8dd97e21b3cb07f03ab106fe1fa2e430c4ac56c59bd29ab9bbec8ad9"} Oct 03 08:36:15 crc kubenswrapper[4810]: I1003 08:36:15.048296 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" containerID="eef72c9d2ea64b267c5d3fe8f64d415b95dbf0904b3fbca6a49cedafc798d606" exitCode=0 Oct 03 08:36:15 crc kubenswrapper[4810]: I1003 08:36:15.048487 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gmxwn" event={"ID":"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22","Type":"ContainerDied","Data":"eef72c9d2ea64b267c5d3fe8f64d415b95dbf0904b3fbca6a49cedafc798d606"} Oct 03 08:36:15 crc kubenswrapper[4810]: I1003 08:36:15.259446 4810 scope.go:117] "RemoveContainer" containerID="8f72f1b66e40a704b17eddf7c1b2fd9c946eedbb4ca304e3a7367a3d6c63fc82" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.372759 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.483964 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt\") pod \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.484095 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm9xm\" (UniqueName: \"kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm\") pod \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.484185 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage\") pod \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\" (UID: \"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22\") " Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.484349 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" (UID: "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.484700 4810 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.490189 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm" (OuterVolumeSpecName: "kube-api-access-xm9xm") pod "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" (UID: "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22"). InnerVolumeSpecName "kube-api-access-xm9xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.506596 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" (UID: "8a9d34d0-974b-4f29-aed9-dbc47bf8bc22"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.586947 4810 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:16 crc kubenswrapper[4810]: I1003 08:36:16.587001 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm9xm\" (UniqueName: \"kubernetes.io/projected/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22-kube-api-access-xm9xm\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:17 crc kubenswrapper[4810]: I1003 08:36:17.071323 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-gmxwn" event={"ID":"8a9d34d0-974b-4f29-aed9-dbc47bf8bc22","Type":"ContainerDied","Data":"ac2f9cfb8dd97e21b3cb07f03ab106fe1fa2e430c4ac56c59bd29ab9bbec8ad9"} Oct 03 08:36:17 crc kubenswrapper[4810]: I1003 08:36:17.071399 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2f9cfb8dd97e21b3cb07f03ab106fe1fa2e430c4ac56c59bd29ab9bbec8ad9" Oct 03 08:36:17 crc kubenswrapper[4810]: I1003 08:36:17.071518 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-gmxwn" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.644484 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-gmxwn"] Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.649440 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-gmxwn"] Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.768314 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zjscb"] Oct 03 08:36:18 crc kubenswrapper[4810]: E1003 08:36:18.768627 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" containerName="storage" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.768645 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" containerName="storage" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.768812 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" containerName="storage" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.769361 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.772010 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.772216 4810 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-fk9ct" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.772496 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.773584 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.787579 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zjscb"] Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.923383 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.923461 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhjj\" (UniqueName: \"kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:18 crc kubenswrapper[4810]: I1003 08:36:18.924101 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.026215 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.026360 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.026423 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhjj\" (UniqueName: \"kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.026808 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " 
pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.027337 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.049464 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhjj\" (UniqueName: \"kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj\") pod \"crc-storage-crc-zjscb\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.086021 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.312866 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9d34d0-974b-4f29-aed9-dbc47bf8bc22" path="/var/lib/kubelet/pods/8a9d34d0-974b-4f29-aed9-dbc47bf8bc22/volumes" Oct 03 08:36:19 crc kubenswrapper[4810]: I1003 08:36:19.497234 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zjscb"] Oct 03 08:36:20 crc kubenswrapper[4810]: I1003 08:36:20.101234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zjscb" event={"ID":"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25","Type":"ContainerStarted","Data":"10a86534e5ad5058012689d34de8aa37461e7d13b42fcd6d9df9b93662d55e2d"} Oct 03 08:36:21 crc kubenswrapper[4810]: I1003 08:36:21.113201 4810 generic.go:334] "Generic (PLEG): container finished" podID="d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" containerID="63ca3e9a4c179c3af75c971c4900ac27f6a45129906c004cdaa4a977045d791e" exitCode=0 Oct 03 08:36:21 crc kubenswrapper[4810]: I1003 08:36:21.113280 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zjscb" event={"ID":"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25","Type":"ContainerDied","Data":"63ca3e9a4c179c3af75c971c4900ac27f6a45129906c004cdaa4a977045d791e"} Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.464442 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.582233 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage\") pod \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.582291 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdhjj\" (UniqueName: \"kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj\") pod \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.582623 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt\") pod \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\" (UID: \"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25\") " Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.582808 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" (UID: "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.583121 4810 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.588960 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj" (OuterVolumeSpecName: "kube-api-access-cdhjj") pod "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" (UID: "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25"). InnerVolumeSpecName "kube-api-access-cdhjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.604378 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" (UID: "d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.684410 4810 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:22 crc kubenswrapper[4810]: I1003 08:36:22.684457 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdhjj\" (UniqueName: \"kubernetes.io/projected/d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25-kube-api-access-cdhjj\") on node \"crc\" DevicePath \"\"" Oct 03 08:36:23 crc kubenswrapper[4810]: I1003 08:36:23.130688 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zjscb" event={"ID":"d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25","Type":"ContainerDied","Data":"10a86534e5ad5058012689d34de8aa37461e7d13b42fcd6d9df9b93662d55e2d"} Oct 03 08:36:23 crc kubenswrapper[4810]: I1003 08:36:23.130739 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zjscb" Oct 03 08:36:23 crc kubenswrapper[4810]: I1003 08:36:23.130755 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a86534e5ad5058012689d34de8aa37461e7d13b42fcd6d9df9b93662d55e2d" Oct 03 08:38:02 crc kubenswrapper[4810]: I1003 08:38:02.089621 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:38:02 crc kubenswrapper[4810]: I1003 08:38:02.090523 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.239620 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:38:31 crc kubenswrapper[4810]: E1003 08:38:31.240487 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" containerName="storage" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.240499 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" containerName="storage" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.240633 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bccbb0-a6ec-46f2-a4b4-5fd9e510fa25" containerName="storage" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.241335 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.243570 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.243763 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.244679 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-r6q8v" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.253397 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.259057 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.271523 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.283937 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.288490 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.358783 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvrr\" (UniqueName: \"kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.358928 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.359012 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qgl\" (UniqueName: \"kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.359067 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.359085 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.359429 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.463937 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qgl\" (UniqueName: \"kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.464002 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.464044 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.464120 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvrr\" (UniqueName: \"kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.464156 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.465054 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.465143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.465340 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.488143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qgl\" (UniqueName: \"kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl\") pod \"dnsmasq-dns-55cbcdb7f9-p9pfb\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.505778 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvrr\" (UniqueName: \"kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr\") pod \"dnsmasq-dns-6697c69467-p9j7d\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.559206 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.571093 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.616542 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.623519 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.637674 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.656664 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.669631 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.669703 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.669726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88vqn\" (UniqueName: \"kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.771477 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.771904 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.771957 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88vqn\" (UniqueName: 
\"kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.773353 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.773954 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.801584 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88vqn\" (UniqueName: \"kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn\") pod \"dnsmasq-dns-5979bfbbf7-prj2j\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:31 crc kubenswrapper[4810]: I1003 08:38:31.993321 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.047068 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.070774 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.073843 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.080205 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.088628 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.088672 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:38:32 crc kubenswrapper[4810]: W1003 08:38:32.146120 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bb6322_c010_4fca_8eb0_6e6622a349a8.slice/crio-7439b622b25db4c82921659ce1b48f7c17ec3c57182ad12885cc802535004860 WatchSource:0}: Error finding container 7439b622b25db4c82921659ce1b48f7c17ec3c57182ad12885cc802535004860: Status 404 returned error can't find the container with id 7439b622b25db4c82921659ce1b48f7c17ec3c57182ad12885cc802535004860 Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.148595 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.151997 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.177253 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.177294 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn4v5\" (UniqueName: \"kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.177314 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.182125 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" event={"ID":"a0bb6322-c010-4fca-8eb0-6e6622a349a8","Type":"ContainerStarted","Data":"7439b622b25db4c82921659ce1b48f7c17ec3c57182ad12885cc802535004860"} Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.247622 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:38:32 
crc kubenswrapper[4810]: I1003 08:38:32.278350 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.278411 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn4v5\" (UniqueName: \"kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.278433 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.279291 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.279395 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.302171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn4v5\" (UniqueName: \"kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5\") pod \"dnsmasq-dns-6cfd7b4c45-7kcb8\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.388706 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.547944 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:38:32 crc kubenswrapper[4810]: W1003 08:38:32.573348 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c30896_1241_4e39_a61e_0fc8feb76dde.slice/crio-0b4cd9294e6f45a54b4e9166cb3790786d4f5949493b34ae468bb08256578118 WatchSource:0}: Error finding container 0b4cd9294e6f45a54b4e9166cb3790786d4f5949493b34ae468bb08256578118: Status 404 returned error can't find the container with id 0b4cd9294e6f45a54b4e9166cb3790786d4f5949493b34ae468bb08256578118 Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.837554 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.839645 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.841945 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.842078 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.842250 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.842357 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.842436 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.842491 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-59wtf" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.843718 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.844310 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.895419 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.895980 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fthr\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896116 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896164 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896254 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.896307 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.902322 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:38:32 crc kubenswrapper[4810]: W1003 08:38:32.928142 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76b624fe_023f_4310_a261_9a0cd0d1c08e.slice/crio-b1366e1ede930a332f2739d2c06e1ac08c704a0258d9c436ac998148b955802d WatchSource:0}: Error finding container b1366e1ede930a332f2739d2c06e1ac08c704a0258d9c436ac998148b955802d: Status 404 returned error can't find the container with id b1366e1ede930a332f2739d2c06e1ac08c704a0258d9c436ac998148b955802d Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997531 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997583 4810 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997657 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997715 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fthr\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997754 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997770 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997798 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.997818 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:32 crc kubenswrapper[4810]: I1003 08:38:32.999097 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.000670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.000670 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.000764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.001034 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.004667 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.004720 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c193bc2f7c1b709308b46ace7f1403c4c72a6486c542920bfcbeb5c8bb233cde/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.006648 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.006722 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.007865 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.011327 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.020852 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fthr\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.049501 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.171356 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.189774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" event={"ID":"93aef6e1-0bc8-4264-af86-523ecfdf9d63","Type":"ContainerStarted","Data":"502dc80e1b6b5bfd9a48240622d974fba3dded73b5b493374b0642ad16121061"} Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.191602 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" event={"ID":"b9c30896-1241-4e39-a61e-0fc8feb76dde","Type":"ContainerStarted","Data":"0b4cd9294e6f45a54b4e9166cb3790786d4f5949493b34ae468bb08256578118"} Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.195033 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" event={"ID":"76b624fe-023f-4310-a261-9a0cd0d1c08e","Type":"ContainerStarted","Data":"b1366e1ede930a332f2739d2c06e1ac08c704a0258d9c436ac998148b955802d"} Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.213641 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.214771 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.216686 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.216786 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.216687 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2kj8q" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.217083 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.217361 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.217507 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.218445 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.226590 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302241 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302294 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302318 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302333 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302407 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kn7\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302429 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302451 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.302477 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403320 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403388 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403411 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403426 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403448 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403483 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kn7\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403500 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403517 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " 
pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.403537 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.404173 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.404278 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.404731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.404861 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.408251 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.408689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.409071 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.409946 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.410063 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c94e7d6938345237d9f3b67215351818a9053e550e2de545bc1aa35e9d1a032d/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.410804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.411862 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.423146 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kn7\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.452376 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.569290 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:38:33 crc kubenswrapper[4810]: I1003 08:38:33.625096 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:38:33 crc kubenswrapper[4810]: W1003 08:38:33.646953 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b6de76_4182_4722_9844_b2dbb626c594.slice/crio-0d7efe4d120581a1f2e702f24c02484f1be8711dd5485f4a3a20c0c04426b6cb WatchSource:0}: Error finding container 0d7efe4d120581a1f2e702f24c02484f1be8711dd5485f4a3a20c0c04426b6cb: Status 404 returned error can't find the container with id 0d7efe4d120581a1f2e702f24c02484f1be8711dd5485f4a3a20c0c04426b6cb Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.225050 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerStarted","Data":"0d7efe4d120581a1f2e702f24c02484f1be8711dd5485f4a3a20c0c04426b6cb"} Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.232743 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.805039 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.806624 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.808673 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.817788 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.818422 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.818648 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.818772 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-llst8" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.821704 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.836312 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.854759 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.854832 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.854932 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.857608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.857708 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522z5\" (UniqueName: \"kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.857805 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.857879 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.859181 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.859237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.963013 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.963066 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.963173 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964102 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964150 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964183 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964205 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964226 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.964271 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522z5\" (UniqueName: \"kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.965073 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.965097 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.966118 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts\") 
pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.968152 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.986170 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.986288 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6084f10721b62e7d9efb2ccfb9a5ff0f8975ef63d8047115e8d3b778caa25d9/globalmount\"" pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.986307 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.986348 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.986371 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:34 crc kubenswrapper[4810]: I1003 08:38:34.995951 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522z5\" (UniqueName: \"kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.028367 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") pod \"openstack-galera-0\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " pod="openstack/openstack-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.144588 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.355777 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.356937 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.357020 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.373424 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8628r" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.374174 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.381279 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.381528 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473828 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473855 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473894 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473958 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.473985 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.474048 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.474112 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.474143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnjx\" (UniqueName: \"kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.576822 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577115 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577345 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577533 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577680 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577840 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.577893 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnjx\" (UniqueName: \"kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.578789 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.579282 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.584377 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.584694 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.585449 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.585948 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.589554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.610611 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.610773 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ddc755427e78b305d6dd5ebd402a6584fcf9e83c550fdcb3b42f8b98d20a960f/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.622286 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnjx\" (UniqueName: \"kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.723380 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"openstack-cell1-galera-0\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:35 crc kubenswrapper[4810]: W1003 08:38:35.833159 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1e2c41_7cfe_4008_aa40_f1936e5d95e1.slice/crio-89756a5eb46b9627a45a57fa64505d7c68d692eadcb60d8949358847b2072291 WatchSource:0}: Error finding container 89756a5eb46b9627a45a57fa64505d7c68d692eadcb60d8949358847b2072291: Status 404 returned error can't find the container with id 89756a5eb46b9627a45a57fa64505d7c68d692eadcb60d8949358847b2072291 Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.890544 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.892875 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.895188 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.895439 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-d56cm" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.902948 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.915476 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.986023 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.986069 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm492\" (UniqueName: \"kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.986106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.986123 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.986145 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:35 crc kubenswrapper[4810]: I1003 08:38:35.992286 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.087266 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.087791 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm492\" (UniqueName: \"kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.088002 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.088033 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.088057 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.089899 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.090659 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.096606 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.099009 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.106564 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm492\" (UniqueName: \"kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492\") pod \"memcached-0\" (UID: 
\"c56de889-4796-4d3c-87b0-faff35387c26\") " pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.252584 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerStarted","Data":"89756a5eb46b9627a45a57fa64505d7c68d692eadcb60d8949358847b2072291"} Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.346027 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.418216 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.564169 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:38:36 crc kubenswrapper[4810]: I1003 08:38:36.607527 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:38:36 crc kubenswrapper[4810]: W1003 08:38:36.612524 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc56de889_4796_4d3c_87b0_faff35387c26.slice/crio-c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84 WatchSource:0}: Error finding container c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84: Status 404 returned error can't find the container with id c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84 Oct 03 08:38:37 crc kubenswrapper[4810]: I1003 08:38:37.263046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c56de889-4796-4d3c-87b0-faff35387c26","Type":"ContainerStarted","Data":"c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84"} Oct 03 08:38:37 crc kubenswrapper[4810]: I1003 08:38:37.271440 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerStarted","Data":"9770605882dda004ff7db2eefb6ee2e01136630ad05f1e87740139db27d2c8df"} Oct 03 08:38:37 crc kubenswrapper[4810]: I1003 08:38:37.273302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerStarted","Data":"085ecb8ca9587e269951638a2a1f72a3180c5580db17c7ce5fa8b556d397dc67"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.452987 4810 generic.go:334] "Generic (PLEG): container finished" podID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerID="07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521" exitCode=0 Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.453163 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" event={"ID":"76b624fe-023f-4310-a261-9a0cd0d1c08e","Type":"ContainerDied","Data":"07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.456499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c56de889-4796-4d3c-87b0-faff35387c26","Type":"ContainerStarted","Data":"211daa9ed516000f1909977a5746e579501a880585f3b68881c446794b0cd9ed"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.456655 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.459392 4810 
generic.go:334] "Generic (PLEG): container finished" podID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerID="9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb" exitCode=0 Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.459449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" event={"ID":"93aef6e1-0bc8-4264-af86-523ecfdf9d63","Type":"ContainerDied","Data":"9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.461041 4810 generic.go:334] "Generic (PLEG): container finished" podID="a0bb6322-c010-4fca-8eb0-6e6622a349a8" containerID="2baa1e8bf9afd691548918cd97b140b0e089264964d6dd8066a9bc058d81f817" exitCode=0 Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.461080 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" event={"ID":"a0bb6322-c010-4fca-8eb0-6e6622a349a8","Type":"ContainerDied","Data":"2baa1e8bf9afd691548918cd97b140b0e089264964d6dd8066a9bc058d81f817"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.462707 4810 generic.go:334] "Generic (PLEG): container finished" podID="b9c30896-1241-4e39-a61e-0fc8feb76dde" containerID="f9125edae274621434b76b9bd7698f8667fa9e29eee2a0d63fcf9c076b236ba3" exitCode=0 Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.462806 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" event={"ID":"b9c30896-1241-4e39-a61e-0fc8feb76dde","Type":"ContainerDied","Data":"f9125edae274621434b76b9bd7698f8667fa9e29eee2a0d63fcf9c076b236ba3"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.482110 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerStarted","Data":"d7650812d29d222e124e1900773dc3db579c0939ad0977b0cc2d623d76fc78cc"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.487043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerStarted","Data":"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece"} Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.503334 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.7960122739999997 podStartE2EDuration="25.503309397s" podCreationTimestamp="2025-10-03 08:38:35 +0000 UTC" firstStartedPulling="2025-10-03 08:38:36.615940745 +0000 UTC m=+6150.043191480" lastFinishedPulling="2025-10-03 08:38:59.323237868 +0000 UTC m=+6172.750488603" observedRunningTime="2025-10-03 08:39:00.50227816 +0000 UTC m=+6173.929528895" watchObservedRunningTime="2025-10-03 08:39:00.503309397 +0000 UTC m=+6173.930560133" Oct 03 08:39:00 crc kubenswrapper[4810]: E1003 08:39:00.811055 4810 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 03 08:39:00 crc kubenswrapper[4810]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/93aef6e1-0bc8-4264-af86-523ecfdf9d63/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 08:39:00 crc kubenswrapper[4810]: > podSandboxID="502dc80e1b6b5bfd9a48240622d974fba3dded73b5b493374b0642ad16121061" Oct 03 08:39:00 crc kubenswrapper[4810]: E1003 08:39:00.811534 4810 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 03 08:39:00 crc 
kubenswrapper[4810]: container &Container{Name:dnsmasq-dns,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:88dc57612f447daadb492dcf3ad854ac,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5rvrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6697c69467-p9j7d_openstack(93aef6e1-0bc8-4264-af86-523ecfdf9d63): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/93aef6e1-0bc8-4264-af86-523ecfdf9d63/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 03 08:39:00 crc kubenswrapper[4810]: > logger="UnhandledError" Oct 03 08:39:00 crc kubenswrapper[4810]: E1003 08:39:00.812834 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/93aef6e1-0bc8-4264-af86-523ecfdf9d63/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.872023 4810 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:39:00 crc kubenswrapper[4810]: I1003 08:39:00.888252 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.008277 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config\") pod \"b9c30896-1241-4e39-a61e-0fc8feb76dde\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.008347 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4qgl\" (UniqueName: \"kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl\") pod \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.008410 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88vqn\" (UniqueName: \"kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn\") pod \"b9c30896-1241-4e39-a61e-0fc8feb76dde\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.008466 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config\") pod \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\" (UID: \"a0bb6322-c010-4fca-8eb0-6e6622a349a8\") " Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.008554 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc\") pod \"b9c30896-1241-4e39-a61e-0fc8feb76dde\" (UID: \"b9c30896-1241-4e39-a61e-0fc8feb76dde\") " Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.014544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn" (OuterVolumeSpecName: "kube-api-access-88vqn") pod "b9c30896-1241-4e39-a61e-0fc8feb76dde" (UID: "b9c30896-1241-4e39-a61e-0fc8feb76dde"). InnerVolumeSpecName "kube-api-access-88vqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.014694 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl" (OuterVolumeSpecName: "kube-api-access-d4qgl") pod "a0bb6322-c010-4fca-8eb0-6e6622a349a8" (UID: "a0bb6322-c010-4fca-8eb0-6e6622a349a8"). InnerVolumeSpecName "kube-api-access-d4qgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.030621 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config" (OuterVolumeSpecName: "config") pod "b9c30896-1241-4e39-a61e-0fc8feb76dde" (UID: "b9c30896-1241-4e39-a61e-0fc8feb76dde"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.032692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9c30896-1241-4e39-a61e-0fc8feb76dde" (UID: "b9c30896-1241-4e39-a61e-0fc8feb76dde"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.040641 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config" (OuterVolumeSpecName: "config") pod "a0bb6322-c010-4fca-8eb0-6e6622a349a8" (UID: "a0bb6322-c010-4fca-8eb0-6e6622a349a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.111081 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.111130 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9c30896-1241-4e39-a61e-0fc8feb76dde-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.111150 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4qgl\" (UniqueName: \"kubernetes.io/projected/a0bb6322-c010-4fca-8eb0-6e6622a349a8-kube-api-access-d4qgl\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.111169 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88vqn\" (UniqueName: \"kubernetes.io/projected/b9c30896-1241-4e39-a61e-0fc8feb76dde-kube-api-access-88vqn\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.111184 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bb6322-c010-4fca-8eb0-6e6622a349a8-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.496040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerStarted","Data":"3ef733fe5152ea32cf6756073f19384a9a20fefd876c04fcc3e0b7cee21fdc3d"} Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.497593 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.497608 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55cbcdb7f9-p9pfb" event={"ID":"a0bb6322-c010-4fca-8eb0-6e6622a349a8","Type":"ContainerDied","Data":"7439b622b25db4c82921659ce1b48f7c17ec3c57182ad12885cc802535004860"} Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.497678 4810 scope.go:117] "RemoveContainer" containerID="2baa1e8bf9afd691548918cd97b140b0e089264964d6dd8066a9bc058d81f817" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.499564 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.499728 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5979bfbbf7-prj2j" event={"ID":"b9c30896-1241-4e39-a61e-0fc8feb76dde","Type":"ContainerDied","Data":"0b4cd9294e6f45a54b4e9166cb3790786d4f5949493b34ae468bb08256578118"} Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.501973 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerStarted","Data":"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e"} Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.504856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" event={"ID":"76b624fe-023f-4310-a261-9a0cd0d1c08e","Type":"ContainerStarted","Data":"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b"} Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.504907 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.518465 4810 scope.go:117] "RemoveContainer" containerID="f9125edae274621434b76b9bd7698f8667fa9e29eee2a0d63fcf9c076b236ba3" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.571037 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" podStartSLOduration=3.120669226 podStartE2EDuration="29.571019276s" podCreationTimestamp="2025-10-03 08:38:32 +0000 UTC" firstStartedPulling="2025-10-03 08:38:32.930689586 +0000 UTC m=+6146.357940321" lastFinishedPulling="2025-10-03 08:38:59.381039636 +0000 UTC m=+6172.808290371" observedRunningTime="2025-10-03 08:39:01.568948571 +0000 UTC m=+6174.996199306" watchObservedRunningTime="2025-10-03 08:39:01.571019276 +0000 UTC m=+6174.998270021" Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.615135 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.637544 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55cbcdb7f9-p9pfb"] Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.657567 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:39:01 crc kubenswrapper[4810]: I1003 08:39:01.663975 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5979bfbbf7-prj2j"] Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.088408 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.088803 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.088848 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.089483 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.089539 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326" gracePeriod=600 Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.513312 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" event={"ID":"93aef6e1-0bc8-4264-af86-523ecfdf9d63","Type":"ContainerStarted","Data":"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9"} Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.514036 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.519253 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326" exitCode=0 Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.519383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326"} Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.519438 4810 scope.go:117] "RemoveContainer" containerID="3640fcd822fa85bce4e190ba0e9534b51f638133da43d971acac7226babb9d7b" Oct 03 08:39:02 crc kubenswrapper[4810]: I1003 08:39:02.534002 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" podStartSLOduration=4.357449578 podStartE2EDuration="31.533980587s" podCreationTimestamp="2025-10-03 08:38:31 +0000 UTC" firstStartedPulling="2025-10-03 08:38:32.258218877 +0000 UTC m=+6145.685469612" lastFinishedPulling="2025-10-03 08:38:59.434749896 +0000 UTC m=+6172.862000621" observedRunningTime="2025-10-03 08:39:02.531334116 +0000 UTC m=+6175.958584851" watchObservedRunningTime="2025-10-03 08:39:02.533980587 +0000 UTC m=+6175.961231322" Oct 03 08:39:03 crc kubenswrapper[4810]: I1003 08:39:03.313150 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bb6322-c010-4fca-8eb0-6e6622a349a8" path="/var/lib/kubelet/pods/a0bb6322-c010-4fca-8eb0-6e6622a349a8/volumes" Oct 03 08:39:03 crc kubenswrapper[4810]: I1003 08:39:03.314535 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c30896-1241-4e39-a61e-0fc8feb76dde" path="/var/lib/kubelet/pods/b9c30896-1241-4e39-a61e-0fc8feb76dde/volumes" Oct 03 08:39:03 crc kubenswrapper[4810]: I1003 08:39:03.528615 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84"} Oct 03 08:39:06 crc kubenswrapper[4810]: I1003 08:39:06.348434 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 08:39:06 crc kubenswrapper[4810]: I1003 08:39:06.639988 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.390943 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.442331 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.574172 4810 generic.go:334] "Generic (PLEG): container finished" podID="51261961-7cea-4ee8-8009-d0669f796caa" containerID="d7650812d29d222e124e1900773dc3db579c0939ad0977b0cc2d623d76fc78cc" exitCode=0 Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.574254 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerDied","Data":"d7650812d29d222e124e1900773dc3db579c0939ad0977b0cc2d623d76fc78cc"} Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.576674 4810 generic.go:334] "Generic (PLEG): container finished" podID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerID="62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece" exitCode=0 Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.577032 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerName="dnsmasq-dns" containerID="cri-o://b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9" gracePeriod=10 Oct 03 08:39:07 crc kubenswrapper[4810]: I1003 08:39:07.577544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerDied","Data":"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece"} Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.031571 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.153956 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rvrr\" (UniqueName: \"kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr\") pod \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.154061 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc\") pod \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.154117 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config\") pod \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\" (UID: \"93aef6e1-0bc8-4264-af86-523ecfdf9d63\") " Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.159431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr" (OuterVolumeSpecName: "kube-api-access-5rvrr") pod "93aef6e1-0bc8-4264-af86-523ecfdf9d63" (UID: "93aef6e1-0bc8-4264-af86-523ecfdf9d63"). InnerVolumeSpecName "kube-api-access-5rvrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.191021 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config" (OuterVolumeSpecName: "config") pod "93aef6e1-0bc8-4264-af86-523ecfdf9d63" (UID: "93aef6e1-0bc8-4264-af86-523ecfdf9d63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.193153 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93aef6e1-0bc8-4264-af86-523ecfdf9d63" (UID: "93aef6e1-0bc8-4264-af86-523ecfdf9d63"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.255468 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rvrr\" (UniqueName: \"kubernetes.io/projected/93aef6e1-0bc8-4264-af86-523ecfdf9d63-kube-api-access-5rvrr\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.255502 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.255514 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93aef6e1-0bc8-4264-af86-523ecfdf9d63-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.584874 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerStarted","Data":"cbb2088da894702fc6a881a5dab4338b4a03c2b8469ca40cdbf5ae5f3ce22b40"} Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.588219 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerStarted","Data":"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8"} Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.590371 4810 generic.go:334] "Generic (PLEG): container finished" podID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerID="b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9" exitCode=0 Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.590402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" event={"ID":"93aef6e1-0bc8-4264-af86-523ecfdf9d63","Type":"ContainerDied","Data":"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9"} Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.590421 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" event={"ID":"93aef6e1-0bc8-4264-af86-523ecfdf9d63","Type":"ContainerDied","Data":"502dc80e1b6b5bfd9a48240622d974fba3dded73b5b493374b0642ad16121061"} Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.590436 4810 scope.go:117] "RemoveContainer" containerID="b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.590441 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6697c69467-p9j7d" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.613680 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=11.807625989 podStartE2EDuration="34.61365645s" podCreationTimestamp="2025-10-03 08:38:34 +0000 UTC" firstStartedPulling="2025-10-03 08:38:36.589198913 +0000 UTC m=+6150.016449648" lastFinishedPulling="2025-10-03 08:38:59.395229374 +0000 UTC m=+6172.822480109" observedRunningTime="2025-10-03 08:39:08.611958275 +0000 UTC m=+6182.039209030" watchObservedRunningTime="2025-10-03 08:39:08.61365645 +0000 UTC m=+6182.040907185" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.633028 4810 scope.go:117] "RemoveContainer" containerID="9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.646139 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=12.735163440000001 podStartE2EDuration="35.646115934s" podCreationTimestamp="2025-10-03 08:38:33 +0000 UTC" firstStartedPulling="2025-10-03 08:38:36.465279225 +0000 UTC m=+6149.892529960" lastFinishedPulling="2025-10-03 08:38:59.376231719 +0000 UTC m=+6172.803482454" observedRunningTime="2025-10-03 08:39:08.639351434 +0000 UTC m=+6182.066602169" watchObservedRunningTime="2025-10-03 08:39:08.646115934 +0000 UTC m=+6182.073366679" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.660207 4810 scope.go:117] "RemoveContainer" containerID="b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9" Oct 03 08:39:08 crc kubenswrapper[4810]: E1003 08:39:08.661285 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9\": container with ID starting with b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9 not found: ID does not exist" containerID="b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.661333 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9"} err="failed to get container status \"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9\": rpc error: code = NotFound desc = could not find container \"b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9\": container with ID starting with b8e83833828d5125047ff7f83a1843fc10827cb53329498f01491f1421e932a9 not found: ID does not exist" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.661369 4810 scope.go:117] "RemoveContainer" containerID="9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb" Oct 03 08:39:08 crc kubenswrapper[4810]: E1003 08:39:08.661667 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb\": container with ID starting with 9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb not found: ID does not exist" containerID="9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.661696 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb"} err="failed to get container status \"9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb\": rpc error: code = NotFound desc = could not find container \"9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb\": container with ID starting with 9fb8088906ad75177cf6b1c117666914624d16b8d32dbcf22d85c4c69a1fbfcb not found: ID does not exist" Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.662341 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:39:08 crc kubenswrapper[4810]: I1003 08:39:08.669820 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6697c69467-p9j7d"] Oct 03 08:39:09 crc kubenswrapper[4810]: I1003 08:39:09.313857 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" path="/var/lib/kubelet/pods/93aef6e1-0bc8-4264-af86-523ecfdf9d63/volumes" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.146715 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.148784 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.196107 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.694591 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.993237 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 08:39:15 crc kubenswrapper[4810]: I1003 08:39:15.993285 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 08:39:16 crc kubenswrapper[4810]: I1003 08:39:16.045546 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 08:39:16 crc kubenswrapper[4810]: I1003 08:39:16.694293 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.107117 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:30 crc kubenswrapper[4810]: E1003 08:39:30.107957 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bb6322-c010-4fca-8eb0-6e6622a349a8" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.107968 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bb6322-c010-4fca-8eb0-6e6622a349a8" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: E1003 08:39:30.107985 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerName="dnsmasq-dns" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.107991 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerName="dnsmasq-dns" Oct 03 08:39:30 crc kubenswrapper[4810]: E1003 08:39:30.108001 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" 
containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.108007 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: E1003 08:39:30.108024 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c30896-1241-4e39-a61e-0fc8feb76dde" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.108029 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c30896-1241-4e39-a61e-0fc8feb76dde" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.108190 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bb6322-c010-4fca-8eb0-6e6622a349a8" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.108202 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c30896-1241-4e39-a61e-0fc8feb76dde" containerName="init" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.108219 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="93aef6e1-0bc8-4264-af86-523ecfdf9d63" containerName="dnsmasq-dns" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.109292 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.131072 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.204451 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.204555 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzvkx\" (UniqueName: \"kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.204662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.306129 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.306450 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzvkx\" (UniqueName: \"kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 
08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.306601 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.306862 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.307080 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.331186 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzvkx\" (UniqueName: \"kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx\") pod \"redhat-operators-hdrjx\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.430592 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:30 crc kubenswrapper[4810]: I1003 08:39:30.875310 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:30 crc kubenswrapper[4810]: W1003 08:39:30.879832 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1004a7c_61cc_4238_93d5_280a19cceca0.slice/crio-2337cecef8bce2c230fba0b49a85056f79f192cd8c0b7c8d903f1a8c99cae9d4 WatchSource:0}: Error finding container 2337cecef8bce2c230fba0b49a85056f79f192cd8c0b7c8d903f1a8c99cae9d4: Status 404 returned error can't find the container with id 2337cecef8bce2c230fba0b49a85056f79f192cd8c0b7c8d903f1a8c99cae9d4 Oct 03 08:39:31 crc kubenswrapper[4810]: I1003 08:39:31.777868 4810 generic.go:334] "Generic (PLEG): container finished" podID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerID="73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d" exitCode=0 Oct 03 08:39:31 crc kubenswrapper[4810]: I1003 08:39:31.777985 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerDied","Data":"73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d"} Oct 03 08:39:31 crc kubenswrapper[4810]: I1003 08:39:31.778398 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerStarted","Data":"2337cecef8bce2c230fba0b49a85056f79f192cd8c0b7c8d903f1a8c99cae9d4"} Oct 03 08:39:32 crc kubenswrapper[4810]: I1003 08:39:32.788656 4810 generic.go:334] "Generic (PLEG): container finished" podID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" 
containerID="3ef733fe5152ea32cf6756073f19384a9a20fefd876c04fcc3e0b7cee21fdc3d" exitCode=0 Oct 03 08:39:32 crc kubenswrapper[4810]: I1003 08:39:32.788743 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerDied","Data":"3ef733fe5152ea32cf6756073f19384a9a20fefd876c04fcc3e0b7cee21fdc3d"} Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.801106 4810 generic.go:334] "Generic (PLEG): container finished" podID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerID="94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e" exitCode=0 Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.801639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerDied","Data":"94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e"} Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.810115 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerStarted","Data":"4313df1b61643f6f9823e4d48d5da7d0f9852ef05be70a31730f5b4dc7f1db1d"} Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.810441 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.812995 4810 generic.go:334] "Generic (PLEG): container finished" podID="e2b6de76-4182-4722-9844-b2dbb626c594" containerID="40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e" exitCode=0 Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.813071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerDied","Data":"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e"} Oct 03 08:39:33 crc kubenswrapper[4810]: I1003 08:39:33.916847 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.38125422 podStartE2EDuration="1m1.916809708s" podCreationTimestamp="2025-10-03 08:38:32 +0000 UTC" firstStartedPulling="2025-10-03 08:38:35.84254693 +0000 UTC m=+6149.269797665" lastFinishedPulling="2025-10-03 08:38:59.378102418 +0000 UTC m=+6172.805353153" observedRunningTime="2025-10-03 08:39:33.90784861 +0000 UTC m=+6207.335099365" watchObservedRunningTime="2025-10-03 08:39:33.916809708 +0000 UTC m=+6207.344060443" Oct 03 08:39:34 crc kubenswrapper[4810]: I1003 08:39:34.825972 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerStarted","Data":"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a"} Oct 03 08:39:34 crc kubenswrapper[4810]: I1003 08:39:34.828404 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerStarted","Data":"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318"} Oct 03 08:39:34 crc kubenswrapper[4810]: I1003 08:39:34.852171 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hdrjx" podStartSLOduration=2.3197928 podStartE2EDuration="4.852141213s" podCreationTimestamp="2025-10-03 08:39:30 +0000 UTC" 
firstStartedPulling="2025-10-03 08:39:31.780214369 +0000 UTC m=+6205.207465104" lastFinishedPulling="2025-10-03 08:39:34.312562772 +0000 UTC m=+6207.739813517" observedRunningTime="2025-10-03 08:39:34.842744803 +0000 UTC m=+6208.269995538" watchObservedRunningTime="2025-10-03 08:39:34.852141213 +0000 UTC m=+6208.279391948" Oct 03 08:39:34 crc kubenswrapper[4810]: I1003 08:39:34.895573 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.165170222 podStartE2EDuration="1m3.895547389s" podCreationTimestamp="2025-10-03 08:38:31 +0000 UTC" firstStartedPulling="2025-10-03 08:38:33.652976231 +0000 UTC m=+6147.080226966" lastFinishedPulling="2025-10-03 08:38:59.383353398 +0000 UTC m=+6172.810604133" observedRunningTime="2025-10-03 08:39:34.881938746 +0000 UTC m=+6208.309189481" watchObservedRunningTime="2025-10-03 08:39:34.895547389 +0000 UTC m=+6208.322798134" Oct 03 08:39:40 crc kubenswrapper[4810]: I1003 08:39:40.431801 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:40 crc kubenswrapper[4810]: I1003 08:39:40.432332 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:40 crc kubenswrapper[4810]: I1003 08:39:40.472119 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:40 crc kubenswrapper[4810]: I1003 08:39:40.947040 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:41 crc kubenswrapper[4810]: I1003 08:39:41.001026 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:42 crc kubenswrapper[4810]: I1003 08:39:42.904562 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hdrjx" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="registry-server" containerID="cri-o://524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a" gracePeriod=2 Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.172140 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.363393 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.428720 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content\") pod \"f1004a7c-61cc-4238-93d5-280a19cceca0\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.428794 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzvkx\" (UniqueName: \"kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx\") pod \"f1004a7c-61cc-4238-93d5-280a19cceca0\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.428920 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities\") pod \"f1004a7c-61cc-4238-93d5-280a19cceca0\" (UID: \"f1004a7c-61cc-4238-93d5-280a19cceca0\") " Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.430202 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities" (OuterVolumeSpecName: "utilities") pod "f1004a7c-61cc-4238-93d5-280a19cceca0" (UID: "f1004a7c-61cc-4238-93d5-280a19cceca0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.431035 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.435810 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx" (OuterVolumeSpecName: "kube-api-access-xzvkx") pod "f1004a7c-61cc-4238-93d5-280a19cceca0" (UID: "f1004a7c-61cc-4238-93d5-280a19cceca0"). InnerVolumeSpecName "kube-api-access-xzvkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.510609 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1004a7c-61cc-4238-93d5-280a19cceca0" (UID: "f1004a7c-61cc-4238-93d5-280a19cceca0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.532811 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1004a7c-61cc-4238-93d5-280a19cceca0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.532857 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzvkx\" (UniqueName: \"kubernetes.io/projected/f1004a7c-61cc-4238-93d5-280a19cceca0-kube-api-access-xzvkx\") on node \"crc\" DevicePath \"\"" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.571295 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.254:5671: connect: connection refused" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.914000 4810 generic.go:334] "Generic (PLEG): container finished" podID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerID="524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a" exitCode=0 Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.914056 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerDied","Data":"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a"} Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.914067 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hdrjx" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.914096 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hdrjx" event={"ID":"f1004a7c-61cc-4238-93d5-280a19cceca0","Type":"ContainerDied","Data":"2337cecef8bce2c230fba0b49a85056f79f192cd8c0b7c8d903f1a8c99cae9d4"} Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.914122 4810 scope.go:117] "RemoveContainer" containerID="524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.934519 4810 scope.go:117] "RemoveContainer" containerID="94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.940987 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.947430 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hdrjx"] Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.968740 4810 scope.go:117] "RemoveContainer" containerID="73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.987658 4810 scope.go:117] "RemoveContainer" containerID="524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a" Oct 03 08:39:43 crc kubenswrapper[4810]: E1003 08:39:43.988246 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a\": container with ID starting with 524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a not found: ID does not exist" containerID="524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a" Oct 03 08:39:43 crc 
kubenswrapper[4810]: I1003 08:39:43.988292 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a"} err="failed to get container status \"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a\": rpc error: code = NotFound desc = could not find container \"524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a\": container with ID starting with 524744bcbc46b14e57455d2f86f2c86afd86ef8a7d8fca570871d28c23947b7a not found: ID does not exist" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.988325 4810 scope.go:117] "RemoveContainer" containerID="94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e" Oct 03 08:39:43 crc kubenswrapper[4810]: E1003 08:39:43.988647 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e\": container with ID starting with 94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e not found: ID does not exist" containerID="94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.988673 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e"} err="failed to get container status \"94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e\": rpc error: code = NotFound desc = could not find container \"94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e\": container with ID starting with 94e41d6a241ebc65bf6c40971385fccd867e26d18d8387772fe373cfe5a1c61e not found: ID does not exist" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.988687 4810 scope.go:117] "RemoveContainer" containerID="73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d" Oct 03 08:39:43 crc kubenswrapper[4810]: E1003 08:39:43.988995 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d\": container with ID starting with 73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d not found: ID does not exist" containerID="73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d" Oct 03 08:39:43 crc kubenswrapper[4810]: I1003 08:39:43.989019 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d"} err="failed to get container status \"73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d\": rpc error: code = NotFound desc = could not find container \"73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d\": container with ID starting with 73e5ff390c192a1df68130fecd13d4de7249dc1619039155c796811f396af55d not found: ID does not exist" Oct 03 08:39:45 crc kubenswrapper[4810]: I1003 08:39:45.313851 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" path="/var/lib/kubelet/pods/f1004a7c-61cc-4238-93d5-280a19cceca0/volumes" Oct 03 08:39:53 crc kubenswrapper[4810]: I1003 08:39:53.176147 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:39:53 crc kubenswrapper[4810]: I1003 08:39:53.571068 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.878987 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:39:59 crc kubenswrapper[4810]: E1003 08:39:59.879946 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="extract-utilities" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.879966 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="extract-utilities" Oct 03 08:39:59 crc kubenswrapper[4810]: E1003 08:39:59.879989 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="extract-content" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.879997 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="extract-content" Oct 03 08:39:59 crc kubenswrapper[4810]: E1003 08:39:59.880017 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="registry-server" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.880027 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="registry-server" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.880249 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1004a7c-61cc-4238-93d5-280a19cceca0" containerName="registry-server" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.881259 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.897375 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.991138 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvr9d\" (UniqueName: \"kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.991200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:39:59 crc kubenswrapper[4810]: I1003 08:39:59.991221 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.092409 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvr9d\" (UniqueName: \"kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: 
\"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.092508 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.092536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.093490 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.093602 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.121458 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvr9d\" (UniqueName: \"kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d\") pod \"dnsmasq-dns-96d5866c7-z65nq\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.197599 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.616165 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:00 crc kubenswrapper[4810]: I1003 08:40:00.634857 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:40:01 crc kubenswrapper[4810]: I1003 08:40:01.053032 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerID="e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473" exitCode=0 Oct 03 08:40:01 crc kubenswrapper[4810]: I1003 08:40:01.053084 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" event={"ID":"9a8db38e-fd2f-478f-97c7-52e6fb134f80","Type":"ContainerDied","Data":"e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473"} Oct 03 08:40:01 crc kubenswrapper[4810]: I1003 08:40:01.053117 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" event={"ID":"9a8db38e-fd2f-478f-97c7-52e6fb134f80","Type":"ContainerStarted","Data":"100ab376fbe543d7d122170c95566aba8db2f7537820fb203ad9c1d48d4891ed"} Oct 03 08:40:01 crc kubenswrapper[4810]: I1003 08:40:01.276779 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:02 crc kubenswrapper[4810]: I1003 08:40:02.063835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" event={"ID":"9a8db38e-fd2f-478f-97c7-52e6fb134f80","Type":"ContainerStarted","Data":"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4"} Oct 03 08:40:02 crc kubenswrapper[4810]: I1003 08:40:02.064202 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:02 crc kubenswrapper[4810]: I1003 08:40:02.082216 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" podStartSLOduration=3.082199392 podStartE2EDuration="3.082199392s" podCreationTimestamp="2025-10-03 08:39:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:40:02.078435043 +0000 UTC m=+6235.505685788" watchObservedRunningTime="2025-10-03 08:40:02.082199392 +0000 UTC m=+6235.509450117" Oct 03 08:40:04 crc kubenswrapper[4810]: I1003 08:40:04.838361 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="rabbitmq" containerID="cri-o://4313df1b61643f6f9823e4d48d5da7d0f9852ef05be70a31730f5b4dc7f1db1d" gracePeriod=604796 Oct 03 08:40:05 crc kubenswrapper[4810]: I1003 08:40:05.177014 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="rabbitmq" containerID="cri-o://b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318" gracePeriod=604797 Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.199856 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.252157 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:40:10 crc 
kubenswrapper[4810]: I1003 08:40:10.252387 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="dnsmasq-dns" containerID="cri-o://7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b" gracePeriod=10 Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.666187 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.746542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc\") pod \"76b624fe-023f-4310-a261-9a0cd0d1c08e\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.746683 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn4v5\" (UniqueName: \"kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5\") pod \"76b624fe-023f-4310-a261-9a0cd0d1c08e\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.746750 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config\") pod \"76b624fe-023f-4310-a261-9a0cd0d1c08e\" (UID: \"76b624fe-023f-4310-a261-9a0cd0d1c08e\") " Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.751516 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5" (OuterVolumeSpecName: "kube-api-access-mn4v5") pod "76b624fe-023f-4310-a261-9a0cd0d1c08e" (UID: "76b624fe-023f-4310-a261-9a0cd0d1c08e"). InnerVolumeSpecName "kube-api-access-mn4v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.778736 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config" (OuterVolumeSpecName: "config") pod "76b624fe-023f-4310-a261-9a0cd0d1c08e" (UID: "76b624fe-023f-4310-a261-9a0cd0d1c08e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.779516 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "76b624fe-023f-4310-a261-9a0cd0d1c08e" (UID: "76b624fe-023f-4310-a261-9a0cd0d1c08e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.848072 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.848104 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/76b624fe-023f-4310-a261-9a0cd0d1c08e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:10 crc kubenswrapper[4810]: I1003 08:40:10.848114 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn4v5\" (UniqueName: \"kubernetes.io/projected/76b624fe-023f-4310-a261-9a0cd0d1c08e-kube-api-access-mn4v5\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.135376 4810 generic.go:334] "Generic (PLEG): container finished" podID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerID="7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b" exitCode=0 Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.135449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" event={"ID":"76b624fe-023f-4310-a261-9a0cd0d1c08e","Type":"ContainerDied","Data":"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b"} Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.135464 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.135490 4810 scope.go:117] "RemoveContainer" containerID="7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.135479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cfd7b4c45-7kcb8" event={"ID":"76b624fe-023f-4310-a261-9a0cd0d1c08e","Type":"ContainerDied","Data":"b1366e1ede930a332f2739d2c06e1ac08c704a0258d9c436ac998148b955802d"} Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.137808 4810 generic.go:334] "Generic (PLEG): container finished" podID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerID="4313df1b61643f6f9823e4d48d5da7d0f9852ef05be70a31730f5b4dc7f1db1d" exitCode=0 Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.137855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerDied","Data":"4313df1b61643f6f9823e4d48d5da7d0f9852ef05be70a31730f5b4dc7f1db1d"} Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.162828 4810 scope.go:117] "RemoveContainer" containerID="07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.171001 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.177061 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cfd7b4c45-7kcb8"] Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.199720 4810 scope.go:117] "RemoveContainer" containerID="7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b" Oct 03 08:40:11 crc kubenswrapper[4810]: E1003 08:40:11.200041 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b\": container with ID starting with 7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b not found: ID does not exist" containerID="7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.200078 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b"} err="failed to get container status \"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b\": rpc error: code = NotFound desc = could not find container \"7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b\": container with ID starting with 7003d6b175d42ce0cd9b028b34885fad573826b1dfc9870fd0adaec52b89463b not found: ID does not exist" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.200099 4810 scope.go:117] "RemoveContainer" containerID="07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521" Oct 03 08:40:11 crc kubenswrapper[4810]: E1003 08:40:11.200315 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521\": container with ID starting with 07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521 not found: ID does not exist" containerID="07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.200337 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521"} err="failed to get container status \"07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521\": rpc error: code = NotFound desc = could not find container \"07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521\": container with ID starting with 07b6b44ef26eb9401c0b18a0901cbb9c3cd50a5afd4c233f449309b9c9c61521 not found: ID does not exist" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.313646 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" path="/var/lib/kubelet/pods/76b624fe-023f-4310-a261-9a0cd0d1c08e/volumes" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.359540 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456238 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kn7\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456290 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456310 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456328 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456345 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456369 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456405 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456565 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd\") pod 
\"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.456583 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data\") pod \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\" (UID: \"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.457353 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.459081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.460099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.462215 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.464377 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.464527 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info" (OuterVolumeSpecName: "pod-info") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.467202 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7" (OuterVolumeSpecName: "kube-api-access-59kn7") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "kube-api-access-59kn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.468733 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0" (OuterVolumeSpecName: "persistence") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.484010 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data" (OuterVolumeSpecName: "config-data") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.493832 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf" (OuterVolumeSpecName: "server-conf") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.538295 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" (UID: "6d1e2c41-7cfe-4008-aa40-f1936e5d95e1"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557696 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kn7\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-kube-api-access-59kn7\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557730 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557741 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557751 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557762 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557773 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557783 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557793 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557827 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") on node \"crc\" " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557841 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.557852 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.586603 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.586764 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0") on node "crc" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.659405 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.754885 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760588 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760618 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760640 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760655 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760681 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.760704 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.761364 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.761396 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.761407 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.764137 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info" (OuterVolumeSpecName: "pod-info") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.807201 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf" (OuterVolumeSpecName: "server-conf") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.858073 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862245 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862401 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862515 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fthr\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret\") pod \"e2b6de76-4182-4722-9844-b2dbb626c594\" (UID: \"e2b6de76-4182-4722-9844-b2dbb626c594\") " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862832 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e2b6de76-4182-4722-9844-b2dbb626c594-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862849 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862865 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862876 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862887 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.862902 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.864885 4810 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.866189 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.866727 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr" (OuterVolumeSpecName: "kube-api-access-8fthr") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "kube-api-access-8fthr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.873404 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180" (OuterVolumeSpecName: "persistence") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.885431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data" (OuterVolumeSpecName: "config-data") pod "e2b6de76-4182-4722-9844-b2dbb626c594" (UID: "e2b6de76-4182-4722-9844-b2dbb626c594"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.963792 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fthr\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-kube-api-access-8fthr\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.963863 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e2b6de76-4182-4722-9844-b2dbb626c594-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.963875 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e2b6de76-4182-4722-9844-b2dbb626c594-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.963950 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") on node \"crc\" " Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.963962 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b6de76-4182-4722-9844-b2dbb626c594-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.982265 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 08:40:11 crc kubenswrapper[4810]: I1003 08:40:11.982422 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180") on node "crc" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.065557 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.146594 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.146612 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6d1e2c41-7cfe-4008-aa40-f1936e5d95e1","Type":"ContainerDied","Data":"89756a5eb46b9627a45a57fa64505d7c68d692eadcb60d8949358847b2072291"} Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.146657 4810 scope.go:117] "RemoveContainer" containerID="4313df1b61643f6f9823e4d48d5da7d0f9852ef05be70a31730f5b4dc7f1db1d" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.149644 4810 generic.go:334] "Generic (PLEG): container finished" podID="e2b6de76-4182-4722-9844-b2dbb626c594" containerID="b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318" exitCode=0 Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.149700 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.149714 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerDied","Data":"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318"} Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.149742 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e2b6de76-4182-4722-9844-b2dbb626c594","Type":"ContainerDied","Data":"0d7efe4d120581a1f2e702f24c02484f1be8711dd5485f4a3a20c0c04426b6cb"} Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.166668 4810 scope.go:117] "RemoveContainer" containerID="3ef733fe5152ea32cf6756073f19384a9a20fefd876c04fcc3e0b7cee21fdc3d" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.188695 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.190935 4810 scope.go:117] "RemoveContainer" containerID="b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.196786 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.204168 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.214160 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.219896 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220218 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220230 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220244 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220250 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220264 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="init" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220271 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="init" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220283 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="setup-container" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220290 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="setup-container" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220323 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" 
containerName="setup-container" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220332 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="setup-container" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.220350 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="dnsmasq-dns" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220360 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="dnsmasq-dns" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220409 4810 scope.go:117] "RemoveContainer" containerID="40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220513 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220529 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="76b624fe-023f-4310-a261-9a0cd0d1c08e" containerName="dnsmasq-dns" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.220540 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" containerName="rabbitmq" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.224124 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.226328 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.227673 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.227688 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.227989 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.228082 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.228210 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-59wtf" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.232391 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.242621 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.254032 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.254173 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.254243 4810 scope.go:117] "RemoveContainer" containerID="b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.254702 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318\": container with ID starting with b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318 not found: ID does not exist" containerID="b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.254777 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318"} err="failed to get container status \"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318\": rpc error: code = NotFound desc = could not find container \"b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318\": container with ID starting with b5bd1585579aebc0a46eb0699f8f079e2bf0beb30bb3ac8c0c01649e4b59c318 not found: ID does not exist" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.254808 4810 scope.go:117] "RemoveContainer" containerID="40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e" Oct 03 08:40:12 crc kubenswrapper[4810]: E1003 08:40:12.255573 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e\": container with ID starting with 40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e not found: ID does not exist" containerID="40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.255640 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e"} err="failed to get container status \"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e\": rpc error: code = NotFound desc = could not find container \"40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e\": container with ID starting with 40159dba01b4a08c42363b50397451439e2487af39743ab43a3cf6a00c79313e not found: ID does not exist" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.256415 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.259542 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.260146 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.260244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2kj8q" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.260289 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.260351 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 
08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.260575 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.264313 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370312 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370362 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370414 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370436 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370459 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsl8h\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370478 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370498 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc 
kubenswrapper[4810]: I1003 08:40:12.370515 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370536 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370560 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370665 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370752 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370836 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.370947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371036 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371068 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371119 
4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371190 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371217 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371293 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.371310 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt2s\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473110 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473161 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473198 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt2s\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473252 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473286 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473313 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473329 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473347 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsl8h\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473444 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473462 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473476 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473493 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473511 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473574 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473616 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.473644 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.475703 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " 
pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.476121 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.476992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.477061 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.477387 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.478341 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.478440 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.478464 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.478689 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.481672 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.481978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.482422 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.482435 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.482461 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c193bc2f7c1b709308b46ace7f1403c4c72a6486c542920bfcbeb5c8bb233cde/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.482602 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.483864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.484172 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.484739 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.484768 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c94e7d6938345237d9f3b67215351818a9053e550e2de545bc1aa35e9d1a032d/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.484805 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.486074 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.492273 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.496182 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsl8h\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.497130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt2s\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.511105 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"rabbitmq-cell1-server-0\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.516940 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"rabbitmq-server-0\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.551273 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.582366 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:40:12 crc kubenswrapper[4810]: I1003 08:40:12.839216 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:40:13 crc kubenswrapper[4810]: I1003 08:40:13.017569 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:40:13 crc kubenswrapper[4810]: W1003 08:40:13.018178 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9436321_8243_4a4a_b07a_b14668a07f1f.slice/crio-46f8e2d9a7f0ec2ad3caba55ca139b608a726f8949b644c7431e5c40575e10e5 WatchSource:0}: Error finding container 46f8e2d9a7f0ec2ad3caba55ca139b608a726f8949b644c7431e5c40575e10e5: Status 404 returned error can't find the container with id 46f8e2d9a7f0ec2ad3caba55ca139b608a726f8949b644c7431e5c40575e10e5 Oct 03 08:40:13 crc kubenswrapper[4810]: I1003 08:40:13.167656 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerStarted","Data":"46f8e2d9a7f0ec2ad3caba55ca139b608a726f8949b644c7431e5c40575e10e5"} Oct 03 08:40:13 crc kubenswrapper[4810]: I1003 08:40:13.169820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerStarted","Data":"a3f7f18a094e7d135f475e1d019258020270f8f6abe77f3020c3cdfa60c2d04f"} Oct 03 08:40:13 crc kubenswrapper[4810]: I1003 08:40:13.313654 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d1e2c41-7cfe-4008-aa40-f1936e5d95e1" path="/var/lib/kubelet/pods/6d1e2c41-7cfe-4008-aa40-f1936e5d95e1/volumes" Oct 03 08:40:13 crc kubenswrapper[4810]: I1003 08:40:13.314702 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b6de76-4182-4722-9844-b2dbb626c594" path="/var/lib/kubelet/pods/e2b6de76-4182-4722-9844-b2dbb626c594/volumes" Oct 03 08:40:14 crc kubenswrapper[4810]: I1003 08:40:14.178880 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerStarted","Data":"69c78926137ade89dbe6277076f566f6eb2cbdab8b9d13d7f01e5c19bd597377"} Oct 03 08:40:14 crc kubenswrapper[4810]: I1003 08:40:14.181113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerStarted","Data":"4c78e1c7563ba76c46be5ea7699ccecaad95d9a403d18c473a5a4f7527d3cda2"} Oct 03 08:40:46 crc kubenswrapper[4810]: I1003 08:40:46.414416 4810 generic.go:334] "Generic (PLEG): container finished" podID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerID="4c78e1c7563ba76c46be5ea7699ccecaad95d9a403d18c473a5a4f7527d3cda2" exitCode=0 Oct 03 08:40:46 crc kubenswrapper[4810]: I1003 08:40:46.415044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerDied","Data":"4c78e1c7563ba76c46be5ea7699ccecaad95d9a403d18c473a5a4f7527d3cda2"} Oct 03 08:40:46 crc kubenswrapper[4810]: I1003 08:40:46.425492 4810 generic.go:334] "Generic (PLEG): container finished" podID="849f27b3-b111-4722-9e85-528f2fbed78d" containerID="69c78926137ade89dbe6277076f566f6eb2cbdab8b9d13d7f01e5c19bd597377" exitCode=0 Oct 03 08:40:46 crc kubenswrapper[4810]: I1003 08:40:46.425536 4810 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerDied","Data":"69c78926137ade89dbe6277076f566f6eb2cbdab8b9d13d7f01e5c19bd597377"} Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.434018 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerStarted","Data":"3e4b15b2571b4b9e95c498a522370b026aa306f3f2b72b93a885f270f401f309"} Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.434738 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.436104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerStarted","Data":"5b4764a5166f67c1b430dfc61b76795284f3ddc58b5b1b1ce0e10d2c554b70f0"} Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.436770 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.459371 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.459353207 podStartE2EDuration="35.459353207s" podCreationTimestamp="2025-10-03 08:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:40:47.455076203 +0000 UTC m=+6280.882326958" watchObservedRunningTime="2025-10-03 08:40:47.459353207 +0000 UTC m=+6280.886603942" Oct 03 08:40:47 crc kubenswrapper[4810]: I1003 08:40:47.490030 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.490007973 podStartE2EDuration="35.490007973s" podCreationTimestamp="2025-10-03 08:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:40:47.482223737 +0000 UTC m=+6280.909474482" watchObservedRunningTime="2025-10-03 08:40:47.490007973 +0000 UTC m=+6280.917258708" Oct 03 08:41:02 crc kubenswrapper[4810]: I1003 08:41:02.560235 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:41:02 crc kubenswrapper[4810]: I1003 08:41:02.585120 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.737824 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.739401 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.741420 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z9qrt" Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.755646 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.879235 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwqqn\" (UniqueName: \"kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn\") pod \"mariadb-client-1-default\" (UID: \"63326aeb-5c28-47e3-b502-c07b522e240d\") " pod="openstack/mariadb-client-1-default" Oct 03 08:41:03 crc kubenswrapper[4810]: I1003 08:41:03.980827 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwqqn\" (UniqueName: \"kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn\") pod \"mariadb-client-1-default\" (UID: \"63326aeb-5c28-47e3-b502-c07b522e240d\") " pod="openstack/mariadb-client-1-default" Oct 03 08:41:04 crc kubenswrapper[4810]: I1003 08:41:04.016339 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwqqn\" (UniqueName: \"kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn\") pod \"mariadb-client-1-default\" (UID: \"63326aeb-5c28-47e3-b502-c07b522e240d\") " pod="openstack/mariadb-client-1-default" Oct 03 08:41:04 crc kubenswrapper[4810]: I1003 08:41:04.063687 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 08:41:04 crc kubenswrapper[4810]: I1003 08:41:04.575386 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 08:41:05 crc kubenswrapper[4810]: I1003 08:41:05.585962 4810 generic.go:334] "Generic (PLEG): container finished" podID="63326aeb-5c28-47e3-b502-c07b522e240d" containerID="d4cf2798385c717dc03871c73a2647743887667eaff6173d2460a4b8db5d6be4" exitCode=0 Oct 03 08:41:05 crc kubenswrapper[4810]: I1003 08:41:05.586010 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"63326aeb-5c28-47e3-b502-c07b522e240d","Type":"ContainerDied","Data":"d4cf2798385c717dc03871c73a2647743887667eaff6173d2460a4b8db5d6be4"} Oct 03 08:41:05 crc kubenswrapper[4810]: I1003 08:41:05.586286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"63326aeb-5c28-47e3-b502-c07b522e240d","Type":"ContainerStarted","Data":"4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc"} Oct 03 08:41:06 crc kubenswrapper[4810]: I1003 08:41:06.954969 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 08:41:06 crc kubenswrapper[4810]: I1003 08:41:06.984485 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_63326aeb-5c28-47e3-b502-c07b522e240d/mariadb-client-1-default/0.log" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.010132 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.015279 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.134081 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwqqn\" (UniqueName: \"kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn\") pod \"63326aeb-5c28-47e3-b502-c07b522e240d\" (UID: \"63326aeb-5c28-47e3-b502-c07b522e240d\") " Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.142238 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn" (OuterVolumeSpecName: "kube-api-access-vwqqn") pod "63326aeb-5c28-47e3-b502-c07b522e240d" (UID: "63326aeb-5c28-47e3-b502-c07b522e240d"). InnerVolumeSpecName "kube-api-access-vwqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.236134 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwqqn\" (UniqueName: \"kubernetes.io/projected/63326aeb-5c28-47e3-b502-c07b522e240d-kube-api-access-vwqqn\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.315507 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63326aeb-5c28-47e3-b502-c07b522e240d" path="/var/lib/kubelet/pods/63326aeb-5c28-47e3-b502-c07b522e240d/volumes" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.404455 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 08:41:07 crc kubenswrapper[4810]: E1003 08:41:07.405258 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63326aeb-5c28-47e3-b502-c07b522e240d" containerName="mariadb-client-1-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.405354 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="63326aeb-5c28-47e3-b502-c07b522e240d" containerName="mariadb-client-1-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.405629 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="63326aeb-5c28-47e3-b502-c07b522e240d" containerName="mariadb-client-1-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.406332 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.412813 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.540385 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndvx\" (UniqueName: \"kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx\") pod \"mariadb-client-2-default\" (UID: \"46e407a9-aaf6-4f2a-b67c-10693e731b62\") " pod="openstack/mariadb-client-2-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.606191 4810 scope.go:117] "RemoveContainer" containerID="d4cf2798385c717dc03871c73a2647743887667eaff6173d2460a4b8db5d6be4" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.606281 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.642305 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndvx\" (UniqueName: \"kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx\") pod \"mariadb-client-2-default\" (UID: \"46e407a9-aaf6-4f2a-b67c-10693e731b62\") " pod="openstack/mariadb-client-2-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.660810 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndvx\" (UniqueName: \"kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx\") pod \"mariadb-client-2-default\" (UID: \"46e407a9-aaf6-4f2a-b67c-10693e731b62\") " pod="openstack/mariadb-client-2-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.731611 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 03 08:41:07 crc kubenswrapper[4810]: I1003 08:41:07.997546 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 08:41:08 crc kubenswrapper[4810]: W1003 08:41:08.000650 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46e407a9_aaf6_4f2a_b67c_10693e731b62.slice/crio-e1de6c0edf987de654f59d4cf9494b563b2b41fce2fd26de4c4c627eabf5f141 WatchSource:0}: Error finding container e1de6c0edf987de654f59d4cf9494b563b2b41fce2fd26de4c4c627eabf5f141: Status 404 returned error can't find the container with id e1de6c0edf987de654f59d4cf9494b563b2b41fce2fd26de4c4c627eabf5f141 Oct 03 08:41:08 crc kubenswrapper[4810]: I1003 08:41:08.618919 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"46e407a9-aaf6-4f2a-b67c-10693e731b62","Type":"ContainerStarted","Data":"feaa23a1fc232816a473aa97963ece2732d9ddbe40446cbbbd546ca9c9e5351e"} Oct 03 08:41:08 crc kubenswrapper[4810]: I1003 08:41:08.618971 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"46e407a9-aaf6-4f2a-b67c-10693e731b62","Type":"ContainerStarted","Data":"e1de6c0edf987de654f59d4cf9494b563b2b41fce2fd26de4c4c627eabf5f141"} Oct 03 08:41:08 crc kubenswrapper[4810]: I1003 08:41:08.640176 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.640146911 podStartE2EDuration="1.640146911s" podCreationTimestamp="2025-10-03 08:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:08.634884871 +0000 UTC m=+6302.062135636" watchObservedRunningTime="2025-10-03 08:41:08.640146911 +0000 UTC m=+6302.067397666" Oct 03 08:41:08 crc kubenswrapper[4810]: I1003 08:41:08.727589 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_46e407a9-aaf6-4f2a-b67c-10693e731b62/mariadb-client-2-default/0.log" Oct 03 08:41:09 crc kubenswrapper[4810]: I1003 08:41:09.640210 4810 generic.go:334] "Generic (PLEG): container finished" podID="46e407a9-aaf6-4f2a-b67c-10693e731b62" containerID="feaa23a1fc232816a473aa97963ece2732d9ddbe40446cbbbd546ca9c9e5351e" exitCode=0 Oct 03 08:41:09 crc kubenswrapper[4810]: I1003 08:41:09.640272 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"46e407a9-aaf6-4f2a-b67c-10693e731b62","Type":"ContainerDied","Data":"feaa23a1fc232816a473aa97963ece2732d9ddbe40446cbbbd546ca9c9e5351e"} Oct 03 08:41:10 crc kubenswrapper[4810]: E1003 08:41:10.558154 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.067640 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.108288 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.117196 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.206698 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndvx\" (UniqueName: \"kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx\") pod \"46e407a9-aaf6-4f2a-b67c-10693e731b62\" (UID: \"46e407a9-aaf6-4f2a-b67c-10693e731b62\") " Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.217974 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx" (OuterVolumeSpecName: "kube-api-access-jndvx") pod "46e407a9-aaf6-4f2a-b67c-10693e731b62" (UID: "46e407a9-aaf6-4f2a-b67c-10693e731b62"). InnerVolumeSpecName "kube-api-access-jndvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.238031 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-3-default"] Oct 03 08:41:11 crc kubenswrapper[4810]: E1003 08:41:11.238471 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e407a9-aaf6-4f2a-b67c-10693e731b62" containerName="mariadb-client-2-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.238550 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e407a9-aaf6-4f2a-b67c-10693e731b62" containerName="mariadb-client-2-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.238733 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e407a9-aaf6-4f2a-b67c-10693e731b62" containerName="mariadb-client-2-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.239463 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-3-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.246351 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.309862 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5lql\" (UniqueName: \"kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql\") pod \"mariadb-client-3-default\" (UID: \"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde\") " pod="openstack/mariadb-client-3-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.310648 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jndvx\" (UniqueName: \"kubernetes.io/projected/46e407a9-aaf6-4f2a-b67c-10693e731b62-kube-api-access-jndvx\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.316486 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e407a9-aaf6-4f2a-b67c-10693e731b62" path="/var/lib/kubelet/pods/46e407a9-aaf6-4f2a-b67c-10693e731b62/volumes" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.412340 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5lql\" (UniqueName: \"kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql\") pod \"mariadb-client-3-default\" (UID: \"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde\") " pod="openstack/mariadb-client-3-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.433451 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5lql\" (UniqueName: \"kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql\") pod \"mariadb-client-3-default\" (UID: \"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde\") " pod="openstack/mariadb-client-3-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.578326 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.660333 4810 scope.go:117] "RemoveContainer" containerID="feaa23a1fc232816a473aa97963ece2732d9ddbe40446cbbbd546ca9c9e5351e" Oct 03 08:41:11 crc kubenswrapper[4810]: I1003 08:41:11.660459 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 03 08:41:12 crc kubenswrapper[4810]: I1003 08:41:12.129968 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-3-default"] Oct 03 08:41:12 crc kubenswrapper[4810]: I1003 08:41:12.670790 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde","Type":"ContainerStarted","Data":"04f58c026417339f08ef2602daba162c0c7e4ef348e1a93ff65f6576f05b604c"} Oct 03 08:41:12 crc kubenswrapper[4810]: I1003 08:41:12.670835 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde","Type":"ContainerStarted","Data":"942d1df38b8d7a3bf0b82abd7933618e89de693a03c395c709d5f7db978091f7"} Oct 03 08:41:12 crc kubenswrapper[4810]: I1003 08:41:12.690729 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-3-default" podStartSLOduration=1.690709094 podStartE2EDuration="1.690709094s" podCreationTimestamp="2025-10-03 08:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:12.686151312 +0000 UTC m=+6306.113402057" watchObservedRunningTime="2025-10-03 08:41:12.690709094 +0000 UTC m=+6306.117959829" Oct 03 08:41:14 crc kubenswrapper[4810]: I1003 08:41:14.691498 4810 generic.go:334] "Generic (PLEG): container finished" podID="f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" containerID="04f58c026417339f08ef2602daba162c0c7e4ef348e1a93ff65f6576f05b604c" exitCode=0 Oct 03 08:41:14 crc kubenswrapper[4810]: I1003 08:41:14.691617 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-3-default" event={"ID":"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde","Type":"ContainerDied","Data":"04f58c026417339f08ef2602daba162c0c7e4ef348e1a93ff65f6576f05b604c"} Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.140481 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.180444 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-3-default"] Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.188126 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-3-default"] Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.204182 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5lql\" (UniqueName: \"kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql\") pod \"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde\" (UID: \"f2e0e624-860d-4df2-ba2d-30ebdfd0dfde\") " Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.210277 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql" (OuterVolumeSpecName: "kube-api-access-v5lql") pod "f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" (UID: "f2e0e624-860d-4df2-ba2d-30ebdfd0dfde"). InnerVolumeSpecName "kube-api-access-v5lql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.306065 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5lql\" (UniqueName: \"kubernetes.io/projected/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde-kube-api-access-v5lql\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.702925 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 03 08:41:16 crc kubenswrapper[4810]: E1003 08:41:16.703293 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" containerName="mariadb-client-3-default" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.703315 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" containerName="mariadb-client-3-default" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.703502 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" containerName="mariadb-client-3-default" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.704158 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.710765 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.713061 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jb5\" (UniqueName: \"kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5\") pod \"mariadb-client-1\" (UID: \"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1\") " pod="openstack/mariadb-client-1" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.720267 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="942d1df38b8d7a3bf0b82abd7933618e89de693a03c395c709d5f7db978091f7" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.720367 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-3-default" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.814925 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jb5\" (UniqueName: \"kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5\") pod \"mariadb-client-1\" (UID: \"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1\") " pod="openstack/mariadb-client-1" Oct 03 08:41:16 crc kubenswrapper[4810]: I1003 08:41:16.844133 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jb5\" (UniqueName: \"kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5\") pod \"mariadb-client-1\" (UID: \"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1\") " pod="openstack/mariadb-client-1" Oct 03 08:41:17 crc kubenswrapper[4810]: I1003 08:41:17.029659 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 08:41:17 crc kubenswrapper[4810]: I1003 08:41:17.318856 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e0e624-860d-4df2-ba2d-30ebdfd0dfde" path="/var/lib/kubelet/pods/f2e0e624-860d-4df2-ba2d-30ebdfd0dfde/volumes" Oct 03 08:41:17 crc kubenswrapper[4810]: I1003 08:41:17.597943 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 08:41:17 crc kubenswrapper[4810]: W1003 08:41:17.606147 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac6fa7cb_3ac9_435c_a5c2_9aa7e37c89e1.slice/crio-99f8445f67076b0ff711a0ac84cc122e83f082019462258b29d67b9c59f2d85c WatchSource:0}: Error finding container 99f8445f67076b0ff711a0ac84cc122e83f082019462258b29d67b9c59f2d85c: Status 404 returned error can't find the container with id 99f8445f67076b0ff711a0ac84cc122e83f082019462258b29d67b9c59f2d85c Oct 03 08:41:17 crc kubenswrapper[4810]: I1003 08:41:17.728722 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1","Type":"ContainerStarted","Data":"99f8445f67076b0ff711a0ac84cc122e83f082019462258b29d67b9c59f2d85c"} Oct 03 08:41:18 crc kubenswrapper[4810]: I1003 08:41:18.741032 4810 generic.go:334] "Generic (PLEG): container finished" podID="ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" containerID="9e57577f352e5807737edf482787b45158c98b78896528f122a76e4d58670993" exitCode=0 Oct 03 08:41:18 crc kubenswrapper[4810]: I1003 08:41:18.741088 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1","Type":"ContainerDied","Data":"9e57577f352e5807737edf482787b45158c98b78896528f122a76e4d58670993"} Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.126584 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.145822 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1/mariadb-client-1/0.log" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.176753 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.181074 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94jb5\" (UniqueName: \"kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5\") pod \"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1\" (UID: \"ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1\") " Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.185081 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.187716 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5" (OuterVolumeSpecName: "kube-api-access-94jb5") pod "ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" (UID: "ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1"). InnerVolumeSpecName "kube-api-access-94jb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.282544 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94jb5\" (UniqueName: \"kubernetes.io/projected/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1-kube-api-access-94jb5\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.622635 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 08:41:20 crc kubenswrapper[4810]: E1003 08:41:20.623266 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" containerName="mariadb-client-1" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.623287 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" containerName="mariadb-client-1" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.623543 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" containerName="mariadb-client-1" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.624358 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.634736 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.687721 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfh2\" (UniqueName: \"kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2\") pod \"mariadb-client-4-default\" (UID: \"69274064-010e-48f7-9ea7-cbe6b969440b\") " pod="openstack/mariadb-client-4-default" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.763849 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f8445f67076b0ff711a0ac84cc122e83f082019462258b29d67b9c59f2d85c" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.763976 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.790275 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfh2\" (UniqueName: \"kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2\") pod \"mariadb-client-4-default\" (UID: \"69274064-010e-48f7-9ea7-cbe6b969440b\") " pod="openstack/mariadb-client-4-default" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.818308 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfh2\" (UniqueName: \"kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2\") pod \"mariadb-client-4-default\" (UID: \"69274064-010e-48f7-9ea7-cbe6b969440b\") " pod="openstack/mariadb-client-4-default" Oct 03 08:41:20 crc kubenswrapper[4810]: E1003 08:41:20.818474 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache]" Oct 03 08:41:20 crc kubenswrapper[4810]: I1003 08:41:20.961128 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 08:41:21 crc kubenswrapper[4810]: I1003 08:41:21.299565 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 08:41:21 crc kubenswrapper[4810]: I1003 08:41:21.324013 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1" path="/var/lib/kubelet/pods/ac6fa7cb-3ac9-435c-a5c2-9aa7e37c89e1/volumes" Oct 03 08:41:21 crc kubenswrapper[4810]: I1003 08:41:21.771264 4810 generic.go:334] "Generic (PLEG): container finished" podID="69274064-010e-48f7-9ea7-cbe6b969440b" containerID="57f370b3b70a0be683b2973a1b147cc55d1e1cf2243cc0f1ddf8ef2201f722f0" exitCode=0 Oct 03 08:41:21 crc kubenswrapper[4810]: I1003 08:41:21.771323 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"69274064-010e-48f7-9ea7-cbe6b969440b","Type":"ContainerDied","Data":"57f370b3b70a0be683b2973a1b147cc55d1e1cf2243cc0f1ddf8ef2201f722f0"} Oct 03 08:41:21 crc kubenswrapper[4810]: I1003 08:41:21.771377 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"69274064-010e-48f7-9ea7-cbe6b969440b","Type":"ContainerStarted","Data":"5d5ce35a70e0193b9d822f2bdd439d84f46504139d065241aec6d2105ef20bbb"} Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.118019 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.136339 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_69274064-010e-48f7-9ea7-cbe6b969440b/mariadb-client-4-default/0.log" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.165740 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.174666 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.227265 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfh2\" (UniqueName: \"kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2\") pod \"69274064-010e-48f7-9ea7-cbe6b969440b\" (UID: \"69274064-010e-48f7-9ea7-cbe6b969440b\") " Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.233080 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2" (OuterVolumeSpecName: "kube-api-access-ngfh2") pod "69274064-010e-48f7-9ea7-cbe6b969440b" (UID: "69274064-010e-48f7-9ea7-cbe6b969440b"). InnerVolumeSpecName "kube-api-access-ngfh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.316928 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69274064-010e-48f7-9ea7-cbe6b969440b" path="/var/lib/kubelet/pods/69274064-010e-48f7-9ea7-cbe6b969440b/volumes" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.330315 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfh2\" (UniqueName: \"kubernetes.io/projected/69274064-010e-48f7-9ea7-cbe6b969440b-kube-api-access-ngfh2\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.803677 4810 scope.go:117] "RemoveContainer" containerID="57f370b3b70a0be683b2973a1b147cc55d1e1cf2243cc0f1ddf8ef2201f722f0" Oct 03 08:41:23 crc kubenswrapper[4810]: I1003 08:41:23.803699 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.491010 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 08:41:27 crc kubenswrapper[4810]: E1003 08:41:27.492573 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69274064-010e-48f7-9ea7-cbe6b969440b" containerName="mariadb-client-4-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.492595 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="69274064-010e-48f7-9ea7-cbe6b969440b" containerName="mariadb-client-4-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.492858 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="69274064-010e-48f7-9ea7-cbe6b969440b" containerName="mariadb-client-4-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.494331 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.498207 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z9qrt" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.498755 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.600472 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq69k\" (UniqueName: \"kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k\") pod \"mariadb-client-5-default\" (UID: \"e9a97ac4-aff8-49d1-9afd-005b019f5a36\") " pod="openstack/mariadb-client-5-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.703136 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq69k\" (UniqueName: \"kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k\") pod \"mariadb-client-5-default\" (UID: \"e9a97ac4-aff8-49d1-9afd-005b019f5a36\") " pod="openstack/mariadb-client-5-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.723943 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq69k\" (UniqueName: \"kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k\") pod \"mariadb-client-5-default\" (UID: \"e9a97ac4-aff8-49d1-9afd-005b019f5a36\") " pod="openstack/mariadb-client-5-default" Oct 03 08:41:27 crc kubenswrapper[4810]: I1003 08:41:27.829940 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 08:41:28 crc kubenswrapper[4810]: I1003 08:41:28.374025 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 08:41:28 crc kubenswrapper[4810]: I1003 08:41:28.852466 4810 generic.go:334] "Generic (PLEG): container finished" podID="e9a97ac4-aff8-49d1-9afd-005b019f5a36" containerID="764c038f1675e1ff8b7d576ca1f31dd45f27040d4c79e530ca432a73d0950d5e" exitCode=0 Oct 03 08:41:28 crc kubenswrapper[4810]: I1003 08:41:28.852532 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"e9a97ac4-aff8-49d1-9afd-005b019f5a36","Type":"ContainerDied","Data":"764c038f1675e1ff8b7d576ca1f31dd45f27040d4c79e530ca432a73d0950d5e"} Oct 03 08:41:28 crc kubenswrapper[4810]: I1003 08:41:28.852766 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"e9a97ac4-aff8-49d1-9afd-005b019f5a36","Type":"ContainerStarted","Data":"f6e75255aa0aba5f420cec5d699499d7922dd3c11d1198b6b7a50afb4c015be0"} Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.234607 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.255338 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_e9a97ac4-aff8-49d1-9afd-005b019f5a36/mariadb-client-5-default/0.log" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.276476 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.292770 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.350741 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq69k\" (UniqueName: \"kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k\") pod \"e9a97ac4-aff8-49d1-9afd-005b019f5a36\" (UID: \"e9a97ac4-aff8-49d1-9afd-005b019f5a36\") " Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.357650 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k" (OuterVolumeSpecName: "kube-api-access-rq69k") pod "e9a97ac4-aff8-49d1-9afd-005b019f5a36" (UID: "e9a97ac4-aff8-49d1-9afd-005b019f5a36"). InnerVolumeSpecName "kube-api-access-rq69k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.452854 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 08:41:30 crc kubenswrapper[4810]: E1003 08:41:30.453295 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a97ac4-aff8-49d1-9afd-005b019f5a36" containerName="mariadb-client-5-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.453318 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a97ac4-aff8-49d1-9afd-005b019f5a36" containerName="mariadb-client-5-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.453535 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a97ac4-aff8-49d1-9afd-005b019f5a36" containerName="mariadb-client-5-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.453545 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq69k\" (UniqueName: \"kubernetes.io/projected/e9a97ac4-aff8-49d1-9afd-005b019f5a36-kube-api-access-rq69k\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.454997 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.467686 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.555015 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tw77\" (UniqueName: \"kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77\") pod \"mariadb-client-6-default\" (UID: \"5a9611d1-61b0-46ef-adfe-9430ec3e52cc\") " pod="openstack/mariadb-client-6-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.656104 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tw77\" (UniqueName: \"kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77\") pod \"mariadb-client-6-default\" (UID: \"5a9611d1-61b0-46ef-adfe-9430ec3e52cc\") " pod="openstack/mariadb-client-6-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.674741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tw77\" (UniqueName: \"kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77\") pod \"mariadb-client-6-default\" (UID: \"5a9611d1-61b0-46ef-adfe-9430ec3e52cc\") " pod="openstack/mariadb-client-6-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.779014 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.880721 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e75255aa0aba5f420cec5d699499d7922dd3c11d1198b6b7a50afb4c015be0" Oct 03 08:41:30 crc kubenswrapper[4810]: I1003 08:41:30.880846 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 03 08:41:31 crc kubenswrapper[4810]: E1003 08:41:31.077668 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9a97ac4_aff8_49d1_9afd_005b019f5a36.slice/crio-f6e75255aa0aba5f420cec5d699499d7922dd3c11d1198b6b7a50afb4c015be0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9a97ac4_aff8_49d1_9afd_005b019f5a36.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.311125 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9a97ac4-aff8-49d1-9afd-005b019f5a36" path="/var/lib/kubelet/pods/e9a97ac4-aff8-49d1-9afd-005b019f5a36/volumes" Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.312133 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 08:41:31 crc kubenswrapper[4810]: W1003 08:41:31.313431 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a9611d1_61b0_46ef_adfe_9430ec3e52cc.slice/crio-750b86be6b556293de4685ff9c9d3aeb6c12b8d822a9861fc8941f98e766cf32 WatchSource:0}: Error finding container 750b86be6b556293de4685ff9c9d3aeb6c12b8d822a9861fc8941f98e766cf32: Status 404 returned error can't find the container with id 750b86be6b556293de4685ff9c9d3aeb6c12b8d822a9861fc8941f98e766cf32 Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.890135 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5a9611d1-61b0-46ef-adfe-9430ec3e52cc","Type":"ContainerStarted","Data":"d84a4fd0a7d2f0b720b3304ceea9a8c9484eb6bf73693aacfc5813244cbdd3f0"} Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.890192 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5a9611d1-61b0-46ef-adfe-9430ec3e52cc","Type":"ContainerStarted","Data":"750b86be6b556293de4685ff9c9d3aeb6c12b8d822a9861fc8941f98e766cf32"} Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.908022 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.907996105 podStartE2EDuration="1.907996105s" podCreationTimestamp="2025-10-03 08:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:31.90255469 +0000 UTC m=+6325.329805425" watchObservedRunningTime="2025-10-03 08:41:31.907996105 +0000 UTC m=+6325.335246850" Oct 03 08:41:31 crc kubenswrapper[4810]: I1003 08:41:31.987816 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_5a9611d1-61b0-46ef-adfe-9430ec3e52cc/mariadb-client-6-default/0.log" Oct 03 08:41:32 crc kubenswrapper[4810]: I1003 08:41:32.088612 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:41:32 crc kubenswrapper[4810]: I1003 08:41:32.088686 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:41:32 crc kubenswrapper[4810]: I1003 08:41:32.900730 4810 generic.go:334] "Generic (PLEG): container finished" podID="5a9611d1-61b0-46ef-adfe-9430ec3e52cc" containerID="d84a4fd0a7d2f0b720b3304ceea9a8c9484eb6bf73693aacfc5813244cbdd3f0" exitCode=0 Oct 03 08:41:32 crc kubenswrapper[4810]: I1003 08:41:32.900788 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5a9611d1-61b0-46ef-adfe-9430ec3e52cc","Type":"ContainerDied","Data":"d84a4fd0a7d2f0b720b3304ceea9a8c9484eb6bf73693aacfc5813244cbdd3f0"} Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.281932 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.319616 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.335831 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.413885 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tw77\" (UniqueName: \"kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77\") pod \"5a9611d1-61b0-46ef-adfe-9430ec3e52cc\" (UID: \"5a9611d1-61b0-46ef-adfe-9430ec3e52cc\") " Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.421145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77" (OuterVolumeSpecName: "kube-api-access-7tw77") pod "5a9611d1-61b0-46ef-adfe-9430ec3e52cc" (UID: "5a9611d1-61b0-46ef-adfe-9430ec3e52cc"). InnerVolumeSpecName "kube-api-access-7tw77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.463446 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 08:41:34 crc kubenswrapper[4810]: E1003 08:41:34.463747 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9611d1-61b0-46ef-adfe-9430ec3e52cc" containerName="mariadb-client-6-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.463763 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9611d1-61b0-46ef-adfe-9430ec3e52cc" containerName="mariadb-client-6-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.463930 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9611d1-61b0-46ef-adfe-9430ec3e52cc" containerName="mariadb-client-6-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.464429 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.469516 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.515940 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tw77\" (UniqueName: \"kubernetes.io/projected/5a9611d1-61b0-46ef-adfe-9430ec3e52cc-kube-api-access-7tw77\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.617951 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qdcp\" (UniqueName: \"kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp\") pod \"mariadb-client-7-default\" (UID: \"cf060450-130c-4a92-babb-6cc958af051c\") " pod="openstack/mariadb-client-7-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.719488 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qdcp\" (UniqueName: \"kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp\") pod \"mariadb-client-7-default\" (UID: \"cf060450-130c-4a92-babb-6cc958af051c\") " pod="openstack/mariadb-client-7-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.742309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qdcp\" (UniqueName: \"kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp\") pod \"mariadb-client-7-default\" (UID: \"cf060450-130c-4a92-babb-6cc958af051c\") " pod="openstack/mariadb-client-7-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.779944 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.916516 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750b86be6b556293de4685ff9c9d3aeb6c12b8d822a9861fc8941f98e766cf32" Oct 03 08:41:34 crc kubenswrapper[4810]: I1003 08:41:34.916573 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 03 08:41:35 crc kubenswrapper[4810]: I1003 08:41:35.255550 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 08:41:35 crc kubenswrapper[4810]: W1003 08:41:35.265264 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf060450_130c_4a92_babb_6cc958af051c.slice/crio-225934603e991d5eedfead248deba2ba76066cffd1ccbbab2bab106487999ec8 WatchSource:0}: Error finding container 225934603e991d5eedfead248deba2ba76066cffd1ccbbab2bab106487999ec8: Status 404 returned error can't find the container with id 225934603e991d5eedfead248deba2ba76066cffd1ccbbab2bab106487999ec8 Oct 03 08:41:35 crc kubenswrapper[4810]: I1003 08:41:35.311196 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9611d1-61b0-46ef-adfe-9430ec3e52cc" path="/var/lib/kubelet/pods/5a9611d1-61b0-46ef-adfe-9430ec3e52cc/volumes" Oct 03 08:41:35 crc kubenswrapper[4810]: I1003 08:41:35.925317 4810 generic.go:334] "Generic (PLEG): container finished" podID="cf060450-130c-4a92-babb-6cc958af051c" containerID="b37be6fcd7a63c6d3c11da956270f616d48e6352c0a7d430bb49d3f6ae84d558" exitCode=0 Oct 03 08:41:35 crc kubenswrapper[4810]: I1003 08:41:35.925360 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"cf060450-130c-4a92-babb-6cc958af051c","Type":"ContainerDied","Data":"b37be6fcd7a63c6d3c11da956270f616d48e6352c0a7d430bb49d3f6ae84d558"} Oct 03 08:41:35 crc kubenswrapper[4810]: I1003 08:41:35.925389 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"cf060450-130c-4a92-babb-6cc958af051c","Type":"ContainerStarted","Data":"225934603e991d5eedfead248deba2ba76066cffd1ccbbab2bab106487999ec8"} Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.295503 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.315377 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_cf060450-130c-4a92-babb-6cc958af051c/mariadb-client-7-default/0.log" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.346281 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.352982 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.463402 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qdcp\" (UniqueName: \"kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp\") pod \"cf060450-130c-4a92-babb-6cc958af051c\" (UID: \"cf060450-130c-4a92-babb-6cc958af051c\") " Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.469753 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp" (OuterVolumeSpecName: "kube-api-access-8qdcp") pod "cf060450-130c-4a92-babb-6cc958af051c" (UID: "cf060450-130c-4a92-babb-6cc958af051c"). InnerVolumeSpecName "kube-api-access-8qdcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.506945 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 03 08:41:37 crc kubenswrapper[4810]: E1003 08:41:37.507507 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf060450-130c-4a92-babb-6cc958af051c" containerName="mariadb-client-7-default" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.507545 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf060450-130c-4a92-babb-6cc958af051c" containerName="mariadb-client-7-default" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.508164 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf060450-130c-4a92-babb-6cc958af051c" containerName="mariadb-client-7-default" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.509265 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.517786 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.566559 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qdcp\" (UniqueName: \"kubernetes.io/projected/cf060450-130c-4a92-babb-6cc958af051c-kube-api-access-8qdcp\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.667775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xgv7\" (UniqueName: \"kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7\") pod \"mariadb-client-2\" (UID: \"819166b0-b866-451f-bb80-6bf14027e129\") " pod="openstack/mariadb-client-2" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.769373 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xgv7\" (UniqueName: \"kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7\") pod \"mariadb-client-2\" (UID: \"819166b0-b866-451f-bb80-6bf14027e129\") " pod="openstack/mariadb-client-2" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.786854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xgv7\" (UniqueName: \"kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7\") pod \"mariadb-client-2\" (UID: \"819166b0-b866-451f-bb80-6bf14027e129\") " pod="openstack/mariadb-client-2" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.829669 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.945625 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225934603e991d5eedfead248deba2ba76066cffd1ccbbab2bab106487999ec8" Oct 03 08:41:37 crc kubenswrapper[4810]: I1003 08:41:37.945696 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 03 08:41:38 crc kubenswrapper[4810]: I1003 08:41:38.324595 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 08:41:38 crc kubenswrapper[4810]: W1003 08:41:38.327191 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod819166b0_b866_451f_bb80_6bf14027e129.slice/crio-decfbe280ca03caaeaa9414c52d5bbc6387e18eabee29357a5837d4e31ce9704 WatchSource:0}: Error finding container decfbe280ca03caaeaa9414c52d5bbc6387e18eabee29357a5837d4e31ce9704: Status 404 returned error can't find the container with id decfbe280ca03caaeaa9414c52d5bbc6387e18eabee29357a5837d4e31ce9704 Oct 03 08:41:38 crc kubenswrapper[4810]: I1003 08:41:38.956504 4810 generic.go:334] "Generic (PLEG): container finished" podID="819166b0-b866-451f-bb80-6bf14027e129" containerID="dad0594dc6b52a8f39d395fe55bdcf2426b48917971c61642f486cbb09c330b2" exitCode=0 Oct 03 08:41:38 crc kubenswrapper[4810]: I1003 08:41:38.956724 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"819166b0-b866-451f-bb80-6bf14027e129","Type":"ContainerDied","Data":"dad0594dc6b52a8f39d395fe55bdcf2426b48917971c61642f486cbb09c330b2"} Oct 03 08:41:38 crc kubenswrapper[4810]: I1003 08:41:38.956875 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"819166b0-b866-451f-bb80-6bf14027e129","Type":"ContainerStarted","Data":"decfbe280ca03caaeaa9414c52d5bbc6387e18eabee29357a5837d4e31ce9704"} Oct 03 08:41:39 crc kubenswrapper[4810]: I1003 08:41:39.313311 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf060450-130c-4a92-babb-6cc958af051c" path="/var/lib/kubelet/pods/cf060450-130c-4a92-babb-6cc958af051c/volumes" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.288039 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.311332 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_819166b0-b866-451f-bb80-6bf14027e129/mariadb-client-2/0.log" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.340571 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.346061 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.410863 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xgv7\" (UniqueName: \"kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7\") pod \"819166b0-b866-451f-bb80-6bf14027e129\" (UID: \"819166b0-b866-451f-bb80-6bf14027e129\") " Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.420284 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7" (OuterVolumeSpecName: "kube-api-access-4xgv7") pod "819166b0-b866-451f-bb80-6bf14027e129" (UID: "819166b0-b866-451f-bb80-6bf14027e129"). InnerVolumeSpecName "kube-api-access-4xgv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.513296 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xgv7\" (UniqueName: \"kubernetes.io/projected/819166b0-b866-451f-bb80-6bf14027e129-kube-api-access-4xgv7\") on node \"crc\" DevicePath \"\"" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.985830 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="decfbe280ca03caaeaa9414c52d5bbc6387e18eabee29357a5837d4e31ce9704" Oct 03 08:41:40 crc kubenswrapper[4810]: I1003 08:41:40.985946 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 03 08:41:41 crc kubenswrapper[4810]: E1003 08:41:41.278322 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache]" Oct 03 08:41:41 crc kubenswrapper[4810]: I1003 08:41:41.311014 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="819166b0-b866-451f-bb80-6bf14027e129" path="/var/lib/kubelet/pods/819166b0-b866-451f-bb80-6bf14027e129/volumes" Oct 03 08:41:51 crc kubenswrapper[4810]: E1003 08:41:51.477371 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:42:01 crc kubenswrapper[4810]: E1003 08:42:01.692509 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice/crio-4bd2d99f2f80a5214524d2c668308d11e87c86caf084209c9e57d22122d970cc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63326aeb_5c28_47e3_b502_c07b522e240d.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:42:02 crc kubenswrapper[4810]: I1003 08:42:02.089444 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:42:02 crc kubenswrapper[4810]: I1003 08:42:02.089507 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:42:15 crc kubenswrapper[4810]: I1003 08:42:15.513714 4810 scope.go:117] "RemoveContainer" 
containerID="eef72c9d2ea64b267c5d3fe8f64d415b95dbf0904b3fbca6a49cedafc798d606" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.089170 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.089796 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.089848 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.090619 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.090674 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" gracePeriod=600 Oct 03 08:42:32 crc kubenswrapper[4810]: E1003 08:42:32.223791 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.431464 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" exitCode=0 Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.431507 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84"} Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.431540 4810 scope.go:117] "RemoveContainer" containerID="c7e0d134ee15e1b9f936849d0ad82e5fb1c94a4ce0140cc4dfdbe719d0677326" Oct 03 08:42:32 crc kubenswrapper[4810]: I1003 08:42:32.432114 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:42:32 crc kubenswrapper[4810]: E1003 08:42:32.432379 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:42:44 crc kubenswrapper[4810]: I1003 08:42:44.302090 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:42:44 crc kubenswrapper[4810]: E1003 08:42:44.303076 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:42:55 crc kubenswrapper[4810]: I1003 08:42:55.302113 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:42:55 crc kubenswrapper[4810]: E1003 08:42:55.302671 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:43:08 crc kubenswrapper[4810]: I1003 08:43:08.303204 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:43:08 crc kubenswrapper[4810]: E1003 08:43:08.303842 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:43:22 crc kubenswrapper[4810]: I1003 08:43:22.302750 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:43:22 crc kubenswrapper[4810]: E1003 08:43:22.304315 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:43:34 crc kubenswrapper[4810]: I1003 08:43:34.304546 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:43:34 crc kubenswrapper[4810]: E1003 08:43:34.307508 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:43:46 crc kubenswrapper[4810]: I1003 08:43:46.302256 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:43:46 crc kubenswrapper[4810]: E1003 08:43:46.304128 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:43:58 crc kubenswrapper[4810]: I1003 08:43:58.302620 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:43:58 crc kubenswrapper[4810]: E1003 08:43:58.303834 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:44:11 crc kubenswrapper[4810]: I1003 08:44:11.303917 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:44:11 crc kubenswrapper[4810]: E1003 08:44:11.304739 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:44:22 crc kubenswrapper[4810]: I1003 08:44:22.302656 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:44:22 crc kubenswrapper[4810]: E1003 08:44:22.303294 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.536285 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:26 crc kubenswrapper[4810]: E1003 08:44:26.537350 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="819166b0-b866-451f-bb80-6bf14027e129" containerName="mariadb-client-2" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.537375 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="819166b0-b866-451f-bb80-6bf14027e129" containerName="mariadb-client-2" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.537705 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="819166b0-b866-451f-bb80-6bf14027e129" 
containerName="mariadb-client-2" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.539779 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.551796 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.626395 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.626630 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjr5\" (UniqueName: \"kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.626968 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.729355 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjr5\" (UniqueName: \"kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.729718 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.729785 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.730731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.730850 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " 
pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.749553 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjr5\" (UniqueName: \"kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5\") pod \"redhat-marketplace-zknq5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:26 crc kubenswrapper[4810]: I1003 08:44:26.877477 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:27 crc kubenswrapper[4810]: I1003 08:44:27.313769 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:27 crc kubenswrapper[4810]: I1003 08:44:27.434342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerStarted","Data":"fd063e0fcd48451e7708cf048f7cc6b5ed68479b26167c6275201a9d1400f7da"} Oct 03 08:44:28 crc kubenswrapper[4810]: I1003 08:44:28.446569 4810 generic.go:334] "Generic (PLEG): container finished" podID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerID="861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f" exitCode=0 Oct 03 08:44:28 crc kubenswrapper[4810]: I1003 08:44:28.446637 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerDied","Data":"861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f"} Oct 03 08:44:28 crc kubenswrapper[4810]: I1003 08:44:28.450828 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:44:30 crc kubenswrapper[4810]: I1003 08:44:30.467696 4810 generic.go:334] "Generic (PLEG): container finished" podID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerID="2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d" exitCode=0 Oct 03 08:44:30 crc kubenswrapper[4810]: I1003 08:44:30.467998 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerDied","Data":"2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d"} Oct 03 08:44:31 crc kubenswrapper[4810]: I1003 08:44:31.482536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerStarted","Data":"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1"} Oct 03 08:44:31 crc kubenswrapper[4810]: I1003 08:44:31.507198 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zknq5" podStartSLOduration=2.797613106 podStartE2EDuration="5.507171956s" podCreationTimestamp="2025-10-03 08:44:26 +0000 UTC" firstStartedPulling="2025-10-03 08:44:28.450529148 +0000 UTC m=+6501.877779883" lastFinishedPulling="2025-10-03 08:44:31.160087998 +0000 UTC m=+6504.587338733" observedRunningTime="2025-10-03 08:44:31.499860422 +0000 UTC m=+6504.927111167" watchObservedRunningTime="2025-10-03 08:44:31.507171956 +0000 UTC m=+6504.934422691" Oct 03 08:44:34 crc kubenswrapper[4810]: I1003 08:44:34.302587 4810 scope.go:117] "RemoveContainer" 
containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:44:34 crc kubenswrapper[4810]: E1003 08:44:34.303359 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:44:36 crc kubenswrapper[4810]: I1003 08:44:36.878252 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:36 crc kubenswrapper[4810]: I1003 08:44:36.878679 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:36 crc kubenswrapper[4810]: I1003 08:44:36.943433 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:37 crc kubenswrapper[4810]: I1003 08:44:37.576776 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:37 crc kubenswrapper[4810]: I1003 08:44:37.629004 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:39 crc kubenswrapper[4810]: I1003 08:44:39.551589 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zknq5" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="registry-server" containerID="cri-o://22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1" gracePeriod=2 Oct 03 08:44:39 crc kubenswrapper[4810]: I1003 08:44:39.960445 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.043884 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqjr5\" (UniqueName: \"kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5\") pod \"59e32694-c74c-47bc-ae86-217fecae7cb5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.043963 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities\") pod \"59e32694-c74c-47bc-ae86-217fecae7cb5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.044057 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content\") pod \"59e32694-c74c-47bc-ae86-217fecae7cb5\" (UID: \"59e32694-c74c-47bc-ae86-217fecae7cb5\") " Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.044860 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities" (OuterVolumeSpecName: "utilities") pod "59e32694-c74c-47bc-ae86-217fecae7cb5" (UID: "59e32694-c74c-47bc-ae86-217fecae7cb5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.049912 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5" (OuterVolumeSpecName: "kube-api-access-jqjr5") pod "59e32694-c74c-47bc-ae86-217fecae7cb5" (UID: "59e32694-c74c-47bc-ae86-217fecae7cb5"). InnerVolumeSpecName "kube-api-access-jqjr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.055647 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59e32694-c74c-47bc-ae86-217fecae7cb5" (UID: "59e32694-c74c-47bc-ae86-217fecae7cb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.146255 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.146363 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqjr5\" (UniqueName: \"kubernetes.io/projected/59e32694-c74c-47bc-ae86-217fecae7cb5-kube-api-access-jqjr5\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.146380 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59e32694-c74c-47bc-ae86-217fecae7cb5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.562393 4810 generic.go:334] "Generic (PLEG): container finished" podID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerID="22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1" exitCode=0 Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.562437 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerDied","Data":"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1"} Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.562471 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zknq5" event={"ID":"59e32694-c74c-47bc-ae86-217fecae7cb5","Type":"ContainerDied","Data":"fd063e0fcd48451e7708cf048f7cc6b5ed68479b26167c6275201a9d1400f7da"} Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.562488 4810 scope.go:117] "RemoveContainer" containerID="22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.562579 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zknq5" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.581710 4810 scope.go:117] "RemoveContainer" containerID="2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.608348 4810 scope.go:117] "RemoveContainer" containerID="861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.613125 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.622554 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zknq5"] Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.641393 4810 scope.go:117] "RemoveContainer" containerID="22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1" Oct 03 08:44:40 crc kubenswrapper[4810]: E1003 08:44:40.641784 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1\": container with ID starting with 22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1 not found: ID does not exist" containerID="22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.641814 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1"} err="failed to get container status \"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1\": rpc error: code = NotFound desc = could not find container \"22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1\": container with ID starting with 22e5f6eae99a76495a24e4f7b441d9bb1c81c46f6b7f5250e8c1e0b21cfe5eb1 not found: ID does not exist" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.641833 4810 scope.go:117] "RemoveContainer" containerID="2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d" Oct 03 08:44:40 crc kubenswrapper[4810]: E1003 08:44:40.642192 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d\": container with ID starting with 2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d not found: ID does not exist" containerID="2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.642215 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d"} err="failed to get container status \"2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d\": rpc error: code = NotFound desc = could not find container \"2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d\": container with ID starting with 2784cf9fb387ee4c147c788bf1f88c939a248a3e4810ba61e4db3a132e20937d not found: ID does not exist" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.642229 4810 scope.go:117] "RemoveContainer" containerID="861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f" Oct 03 08:44:40 crc kubenswrapper[4810]: E1003 08:44:40.642491 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f\": container with ID starting with 861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f not found: ID does not exist" containerID="861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f" Oct 03 08:44:40 crc kubenswrapper[4810]: I1003 08:44:40.642516 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f"} err="failed to get container status \"861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f\": rpc error: code = NotFound desc = could not find container \"861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f\": container with ID starting with 861ed3d8420ebacd18d220fe22d6a202fb80bc01aac11d4ac86c4b7ea9e7603f not found: ID does not exist" Oct 03 08:44:41 crc kubenswrapper[4810]: I1003 08:44:41.332402 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" path="/var/lib/kubelet/pods/59e32694-c74c-47bc-ae86-217fecae7cb5/volumes" Oct 03 08:44:49 crc kubenswrapper[4810]: I1003 08:44:49.303798 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:44:49 crc kubenswrapper[4810]: E1003 08:44:49.305074 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.497738 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:44:57 crc kubenswrapper[4810]: E1003 08:44:57.498940 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="registry-server" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.498975 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="registry-server" Oct 03 08:44:57 crc kubenswrapper[4810]: E1003 08:44:57.499001 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="extract-content" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.499012 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="extract-content" Oct 03 08:44:57 crc kubenswrapper[4810]: E1003 08:44:57.499028 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="extract-utilities" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.499037 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="extract-utilities" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.499229 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="59e32694-c74c-47bc-ae86-217fecae7cb5" containerName="registry-server" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.500736 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.514467 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.514555 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.514603 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xq7z\" (UniqueName: \"kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.538181 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.617072 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.617142 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.617184 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xq7z\" (UniqueName: \"kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.617750 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.617793 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.638840 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5xq7z\" (UniqueName: \"kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z\") pod \"community-operators-kfq8m\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:57 crc kubenswrapper[4810]: I1003 08:44:57.851981 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:44:58 crc kubenswrapper[4810]: I1003 08:44:58.317803 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:44:58 crc kubenswrapper[4810]: W1003 08:44:58.327701 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9cede8_83ae_46a3_95dd_a72e42a008f4.slice/crio-2dc8791ec10509e0e89af174a98a3bca02c4ce2fff15bb277cbf27ad3d4036f2 WatchSource:0}: Error finding container 2dc8791ec10509e0e89af174a98a3bca02c4ce2fff15bb277cbf27ad3d4036f2: Status 404 returned error can't find the container with id 2dc8791ec10509e0e89af174a98a3bca02c4ce2fff15bb277cbf27ad3d4036f2 Oct 03 08:44:58 crc kubenswrapper[4810]: I1003 08:44:58.715958 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerID="4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173" exitCode=0 Oct 03 08:44:58 crc kubenswrapper[4810]: I1003 08:44:58.716051 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerDied","Data":"4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173"} Oct 03 08:44:58 crc kubenswrapper[4810]: I1003 08:44:58.716185 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerStarted","Data":"2dc8791ec10509e0e89af174a98a3bca02c4ce2fff15bb277cbf27ad3d4036f2"} Oct 03 08:44:59 crc kubenswrapper[4810]: I1003 08:44:59.728233 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerStarted","Data":"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae"} Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.176328 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw"] Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.178158 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.181980 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.182258 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw"] Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.182835 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.366343 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.366531 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.366983 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r56pk\" (UniqueName: \"kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.467914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r56pk\" (UniqueName: \"kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.467963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.468000 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.468850 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume\") pod 
\"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.481844 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.497804 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r56pk\" (UniqueName: \"kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk\") pod \"collect-profiles-29324685-5jncw\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.500151 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.739649 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerID="4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae" exitCode=0 Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.739708 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerDied","Data":"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae"} Oct 03 08:45:00 crc kubenswrapper[4810]: I1003 08:45:00.927696 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw"] Oct 03 08:45:00 crc kubenswrapper[4810]: W1003 08:45:00.936797 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f8c257a_799f_41ab_b3de_c0385c90ab80.slice/crio-4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9 WatchSource:0}: Error finding container 4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9: Status 404 returned error can't find the container with id 4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9 Oct 03 08:45:01 crc kubenswrapper[4810]: I1003 08:45:01.748630 4810 generic.go:334] "Generic (PLEG): container finished" podID="8f8c257a-799f-41ab-b3de-c0385c90ab80" containerID="002040f6da1b02053b17eeed34f2e9f7fb55ee92ba2677bf26dbcf4f347cd1d1" exitCode=0 Oct 03 08:45:01 crc kubenswrapper[4810]: I1003 08:45:01.748699 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" event={"ID":"8f8c257a-799f-41ab-b3de-c0385c90ab80","Type":"ContainerDied","Data":"002040f6da1b02053b17eeed34f2e9f7fb55ee92ba2677bf26dbcf4f347cd1d1"} Oct 03 08:45:01 crc kubenswrapper[4810]: I1003 08:45:01.748994 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" event={"ID":"8f8c257a-799f-41ab-b3de-c0385c90ab80","Type":"ContainerStarted","Data":"4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9"} Oct 03 08:45:01 crc kubenswrapper[4810]: I1003 08:45:01.751493 4810 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerStarted","Data":"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9"} Oct 03 08:45:01 crc kubenswrapper[4810]: I1003 08:45:01.783701 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kfq8m" podStartSLOduration=1.982798458 podStartE2EDuration="4.783683239s" podCreationTimestamp="2025-10-03 08:44:57 +0000 UTC" firstStartedPulling="2025-10-03 08:44:58.717703562 +0000 UTC m=+6532.144954317" lastFinishedPulling="2025-10-03 08:45:01.518588363 +0000 UTC m=+6534.945839098" observedRunningTime="2025-10-03 08:45:01.779743684 +0000 UTC m=+6535.206994419" watchObservedRunningTime="2025-10-03 08:45:01.783683239 +0000 UTC m=+6535.210933974" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.105317 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.209442 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume\") pod \"8f8c257a-799f-41ab-b3de-c0385c90ab80\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.209767 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r56pk\" (UniqueName: \"kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk\") pod \"8f8c257a-799f-41ab-b3de-c0385c90ab80\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.209797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume\") pod \"8f8c257a-799f-41ab-b3de-c0385c90ab80\" (UID: \"8f8c257a-799f-41ab-b3de-c0385c90ab80\") " Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.210730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f8c257a-799f-41ab-b3de-c0385c90ab80" (UID: "8f8c257a-799f-41ab-b3de-c0385c90ab80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.214807 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f8c257a-799f-41ab-b3de-c0385c90ab80" (UID: "8f8c257a-799f-41ab-b3de-c0385c90ab80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.218127 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk" (OuterVolumeSpecName: "kube-api-access-r56pk") pod "8f8c257a-799f-41ab-b3de-c0385c90ab80" (UID: "8f8c257a-799f-41ab-b3de-c0385c90ab80"). InnerVolumeSpecName "kube-api-access-r56pk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.302211 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:45:03 crc kubenswrapper[4810]: E1003 08:45:03.302475 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.311801 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r56pk\" (UniqueName: \"kubernetes.io/projected/8f8c257a-799f-41ab-b3de-c0385c90ab80-kube-api-access-r56pk\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.311840 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f8c257a-799f-41ab-b3de-c0385c90ab80-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.311853 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f8c257a-799f-41ab-b3de-c0385c90ab80-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.767796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" event={"ID":"8f8c257a-799f-41ab-b3de-c0385c90ab80","Type":"ContainerDied","Data":"4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9"} Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.767847 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4851114ce110859d0f2b0f9916beb12ca98c95f8b94cc23eb7c3677ccfbdbac9" Oct 03 08:45:03 crc kubenswrapper[4810]: I1003 08:45:03.767931 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-5jncw" Oct 03 08:45:04 crc kubenswrapper[4810]: I1003 08:45:04.210544 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2"] Oct 03 08:45:04 crc kubenswrapper[4810]: I1003 08:45:04.216029 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324640-qkwb2"] Oct 03 08:45:05 crc kubenswrapper[4810]: I1003 08:45:05.312311 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867e3ee1-2322-4e01-af2f-882920acb70d" path="/var/lib/kubelet/pods/867e3ee1-2322-4e01-af2f-882920acb70d/volumes" Oct 03 08:45:07 crc kubenswrapper[4810]: I1003 08:45:07.853006 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:07 crc kubenswrapper[4810]: I1003 08:45:07.853388 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:07 crc kubenswrapper[4810]: I1003 08:45:07.906221 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:08 crc kubenswrapper[4810]: I1003 08:45:08.877439 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:08 crc kubenswrapper[4810]: I1003 08:45:08.940813 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:45:10 crc kubenswrapper[4810]: I1003 08:45:10.821715 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kfq8m" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="registry-server" containerID="cri-o://0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9" gracePeriod=2 Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.212943 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.322228 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities\") pod \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.322301 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content\") pod \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.322323 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xq7z\" (UniqueName: \"kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z\") pod \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\" (UID: \"9a9cede8-83ae-46a3-95dd-a72e42a008f4\") " Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.324019 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities" (OuterVolumeSpecName: "utilities") pod "9a9cede8-83ae-46a3-95dd-a72e42a008f4" (UID: "9a9cede8-83ae-46a3-95dd-a72e42a008f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.328265 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z" (OuterVolumeSpecName: "kube-api-access-5xq7z") pod "9a9cede8-83ae-46a3-95dd-a72e42a008f4" (UID: "9a9cede8-83ae-46a3-95dd-a72e42a008f4"). InnerVolumeSpecName "kube-api-access-5xq7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.376324 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a9cede8-83ae-46a3-95dd-a72e42a008f4" (UID: "9a9cede8-83ae-46a3-95dd-a72e42a008f4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.424667 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.424700 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xq7z\" (UniqueName: \"kubernetes.io/projected/9a9cede8-83ae-46a3-95dd-a72e42a008f4-kube-api-access-5xq7z\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.424712 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a9cede8-83ae-46a3-95dd-a72e42a008f4-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.830047 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerID="0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9" exitCode=0 Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.830091 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerDied","Data":"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9"} Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.830122 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kfq8m" event={"ID":"9a9cede8-83ae-46a3-95dd-a72e42a008f4","Type":"ContainerDied","Data":"2dc8791ec10509e0e89af174a98a3bca02c4ce2fff15bb277cbf27ad3d4036f2"} Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.830139 4810 scope.go:117] "RemoveContainer" containerID="0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.830254 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kfq8m" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.853334 4810 scope.go:117] "RemoveContainer" containerID="4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.869387 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.876885 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kfq8m"] Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.890107 4810 scope.go:117] "RemoveContainer" containerID="4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.931168 4810 scope.go:117] "RemoveContainer" containerID="0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9" Oct 03 08:45:11 crc kubenswrapper[4810]: E1003 08:45:11.931718 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9\": container with ID starting with 0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9 not found: ID does not exist" containerID="0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.931762 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9"} err="failed to get container status \"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9\": rpc error: code = NotFound desc = could not find container \"0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9\": container with ID starting with 0c3a361377e847e599596e7153744e508db7e5c6ceee4b44af655b821aa555f9 not found: ID does not exist" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.931789 4810 scope.go:117] "RemoveContainer" containerID="4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae" Oct 03 08:45:11 crc kubenswrapper[4810]: E1003 08:45:11.932321 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae\": container with ID starting with 4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae not found: ID does not exist" containerID="4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.932349 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae"} err="failed to get container status \"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae\": rpc error: code = NotFound desc = could not find container \"4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae\": container with ID starting with 4b5c6b40dadff9f37d3fa45d8094e2180e14146b0f3a7a104e77706ce8aabfae not found: ID does not exist" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.932365 4810 scope.go:117] "RemoveContainer" containerID="4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173" Oct 03 08:45:11 crc kubenswrapper[4810]: E1003 08:45:11.932664 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173\": container with ID starting with 4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173 not found: ID does not exist" containerID="4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173" Oct 03 08:45:11 crc kubenswrapper[4810]: I1003 08:45:11.932703 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173"} err="failed to get container status \"4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173\": rpc error: code = NotFound desc = could not find container \"4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173\": container with ID starting with 4558ac860e44af6684d1b8f90e5afffea3d3865f754265c29cf04b0927d92173 not found: ID does not exist" Oct 03 08:45:13 crc kubenswrapper[4810]: I1003 08:45:13.311541 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" path="/var/lib/kubelet/pods/9a9cede8-83ae-46a3-95dd-a72e42a008f4/volumes" Oct 03 08:45:15 crc kubenswrapper[4810]: I1003 08:45:15.664531 4810 scope.go:117] "RemoveContainer" containerID="1277d005e3686ba6fecbf89587112952bea4383de4b42fded4b4e54798f9463e" Oct 03 08:45:18 crc kubenswrapper[4810]: I1003 08:45:18.302750 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:45:18 crc kubenswrapper[4810]: E1003 08:45:18.303471 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:45:32 crc kubenswrapper[4810]: I1003 08:45:32.301805 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:45:32 crc kubenswrapper[4810]: E1003 08:45:32.302350 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:45:45 crc kubenswrapper[4810]: I1003 08:45:45.302625 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:45:45 crc kubenswrapper[4810]: E1003 08:45:45.303327 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:45:56 crc kubenswrapper[4810]: I1003 08:45:56.303038 4810 scope.go:117] "RemoveContainer" 
containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:45:56 crc kubenswrapper[4810]: E1003 08:45:56.304047 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:46:09 crc kubenswrapper[4810]: I1003 08:46:09.303205 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:46:09 crc kubenswrapper[4810]: E1003 08:46:09.305446 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:46:21 crc kubenswrapper[4810]: I1003 08:46:21.302736 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:46:21 crc kubenswrapper[4810]: E1003 08:46:21.303319 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:46:35 crc kubenswrapper[4810]: I1003 08:46:35.303309 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:46:35 crc kubenswrapper[4810]: E1003 08:46:35.304150 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:46:47 crc kubenswrapper[4810]: I1003 08:46:47.321613 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:46:47 crc kubenswrapper[4810]: E1003 08:46:47.322590 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.967522 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:46:48 crc kubenswrapper[4810]: E1003 08:46:48.968575 4810 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8f8c257a-799f-41ab-b3de-c0385c90ab80" containerName="collect-profiles" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968602 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8c257a-799f-41ab-b3de-c0385c90ab80" containerName="collect-profiles" Oct 03 08:46:48 crc kubenswrapper[4810]: E1003 08:46:48.968630 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="extract-utilities" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968643 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="extract-utilities" Oct 03 08:46:48 crc kubenswrapper[4810]: E1003 08:46:48.968664 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="registry-server" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968671 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="registry-server" Oct 03 08:46:48 crc kubenswrapper[4810]: E1003 08:46:48.968680 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="extract-content" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968687 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="extract-content" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968855 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a9cede8-83ae-46a3-95dd-a72e42a008f4" containerName="registry-server" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.968880 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8c257a-799f-41ab-b3de-c0385c90ab80" containerName="collect-profiles" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.970358 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.977385 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.998412 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.998653 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs8hb\" (UniqueName: \"kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:48 crc kubenswrapper[4810]: I1003 08:46:48.999246 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.100464 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.100534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.100596 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs8hb\" (UniqueName: \"kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.101065 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.101335 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.120278 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hs8hb\" (UniqueName: \"kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb\") pod \"certified-operators-lztvf\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.314995 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:49 crc kubenswrapper[4810]: W1003 08:46:49.763598 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26fae5a9_d597_453e_8636_07c2d2e73b4c.slice/crio-75a5c256e08da8ec768b9444d8fba28da89c32631689ab89449f5260773b92d8 WatchSource:0}: Error finding container 75a5c256e08da8ec768b9444d8fba28da89c32631689ab89449f5260773b92d8: Status 404 returned error can't find the container with id 75a5c256e08da8ec768b9444d8fba28da89c32631689ab89449f5260773b92d8 Oct 03 08:46:49 crc kubenswrapper[4810]: I1003 08:46:49.775656 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:46:50 crc kubenswrapper[4810]: I1003 08:46:50.642527 4810 generic.go:334] "Generic (PLEG): container finished" podID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerID="b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68" exitCode=0 Oct 03 08:46:50 crc kubenswrapper[4810]: I1003 08:46:50.642653 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerDied","Data":"b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68"} Oct 03 08:46:50 crc kubenswrapper[4810]: I1003 08:46:50.644231 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerStarted","Data":"75a5c256e08da8ec768b9444d8fba28da89c32631689ab89449f5260773b92d8"} Oct 03 08:46:51 crc kubenswrapper[4810]: I1003 08:46:51.657882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerStarted","Data":"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78"} Oct 03 08:46:52 crc kubenswrapper[4810]: I1003 08:46:52.668226 4810 generic.go:334] "Generic (PLEG): container finished" podID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerID="fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78" exitCode=0 Oct 03 08:46:52 crc kubenswrapper[4810]: I1003 08:46:52.668261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerDied","Data":"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78"} Oct 03 08:46:53 crc kubenswrapper[4810]: I1003 08:46:53.677001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerStarted","Data":"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40"} Oct 03 08:46:53 crc kubenswrapper[4810]: I1003 08:46:53.699083 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lztvf" 
podStartSLOduration=3.270578398 podStartE2EDuration="5.699053536s" podCreationTimestamp="2025-10-03 08:46:48 +0000 UTC" firstStartedPulling="2025-10-03 08:46:50.644816292 +0000 UTC m=+6644.072067027" lastFinishedPulling="2025-10-03 08:46:53.07329143 +0000 UTC m=+6646.500542165" observedRunningTime="2025-10-03 08:46:53.696498018 +0000 UTC m=+6647.123748753" watchObservedRunningTime="2025-10-03 08:46:53.699053536 +0000 UTC m=+6647.126304281" Oct 03 08:46:59 crc kubenswrapper[4810]: I1003 08:46:59.316023 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:59 crc kubenswrapper[4810]: I1003 08:46:59.316609 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:59 crc kubenswrapper[4810]: I1003 08:46:59.362733 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:59 crc kubenswrapper[4810]: I1003 08:46:59.798523 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:46:59 crc kubenswrapper[4810]: I1003 08:46:59.865497 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:47:00 crc kubenswrapper[4810]: I1003 08:47:00.303374 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:47:00 crc kubenswrapper[4810]: E1003 08:47:00.303782 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:47:01 crc kubenswrapper[4810]: I1003 08:47:01.748050 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lztvf" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="registry-server" containerID="cri-o://158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40" gracePeriod=2 Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.212345 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.405647 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs8hb\" (UniqueName: \"kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb\") pod \"26fae5a9-d597-453e-8636-07c2d2e73b4c\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.406095 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content\") pod \"26fae5a9-d597-453e-8636-07c2d2e73b4c\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.406128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities\") pod \"26fae5a9-d597-453e-8636-07c2d2e73b4c\" (UID: \"26fae5a9-d597-453e-8636-07c2d2e73b4c\") " Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.406942 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities" (OuterVolumeSpecName: "utilities") pod "26fae5a9-d597-453e-8636-07c2d2e73b4c" (UID: "26fae5a9-d597-453e-8636-07c2d2e73b4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.412364 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb" (OuterVolumeSpecName: "kube-api-access-hs8hb") pod "26fae5a9-d597-453e-8636-07c2d2e73b4c" (UID: "26fae5a9-d597-453e-8636-07c2d2e73b4c"). InnerVolumeSpecName "kube-api-access-hs8hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.452128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26fae5a9-d597-453e-8636-07c2d2e73b4c" (UID: "26fae5a9-d597-453e-8636-07c2d2e73b4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.508036 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.508071 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26fae5a9-d597-453e-8636-07c2d2e73b4c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.508082 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs8hb\" (UniqueName: \"kubernetes.io/projected/26fae5a9-d597-453e-8636-07c2d2e73b4c-kube-api-access-hs8hb\") on node \"crc\" DevicePath \"\"" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.758315 4810 generic.go:334] "Generic (PLEG): container finished" podID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerID="158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40" exitCode=0 Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.758381 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerDied","Data":"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40"} Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.758422 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lztvf" event={"ID":"26fae5a9-d597-453e-8636-07c2d2e73b4c","Type":"ContainerDied","Data":"75a5c256e08da8ec768b9444d8fba28da89c32631689ab89449f5260773b92d8"} Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.758456 4810 scope.go:117] "RemoveContainer" containerID="158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.758386 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lztvf" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.781942 4810 scope.go:117] "RemoveContainer" containerID="fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.799958 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.812830 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lztvf"] Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.817459 4810 scope.go:117] "RemoveContainer" containerID="b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.860447 4810 scope.go:117] "RemoveContainer" containerID="158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40" Oct 03 08:47:02 crc kubenswrapper[4810]: E1003 08:47:02.860884 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40\": container with ID starting with 158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40 not found: ID does not exist" containerID="158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.860954 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40"} err="failed to get container status \"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40\": rpc error: code = NotFound desc = could not find container \"158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40\": container with ID starting with 158641894d1345b99e4f68d866b81c8435d8288433b0550d21d8d524925b6e40 not found: ID does not exist" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.860982 4810 scope.go:117] "RemoveContainer" containerID="fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78" Oct 03 08:47:02 crc kubenswrapper[4810]: E1003 08:47:02.861358 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78\": container with ID starting with fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78 not found: ID does not exist" containerID="fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.861417 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78"} err="failed to get container status \"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78\": rpc error: code = NotFound desc = could not find container \"fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78\": container with ID starting with fd05b494ceb871e49189a8ed7081fd84297d1d612ce16a8b0f8f836303305a78 not found: ID does not exist" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.861455 4810 scope.go:117] "RemoveContainer" containerID="b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68" Oct 03 08:47:02 crc kubenswrapper[4810]: E1003 08:47:02.861822 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68\": container with ID starting with b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68 not found: ID does not exist" containerID="b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68" Oct 03 08:47:02 crc kubenswrapper[4810]: I1003 08:47:02.861859 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68"} err="failed to get container status \"b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68\": rpc error: code = NotFound desc = could not find container \"b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68\": container with ID starting with b8f402d418ca9e2e2deb8e74e3077f7d7f2c6d6b4c340dd72ea57ef3a39c9d68 not found: ID does not exist" Oct 03 08:47:03 crc kubenswrapper[4810]: I1003 08:47:03.313954 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" path="/var/lib/kubelet/pods/26fae5a9-d597-453e-8636-07c2d2e73b4c/volumes" Oct 03 08:47:15 crc kubenswrapper[4810]: I1003 08:47:15.303609 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:47:15 crc kubenswrapper[4810]: E1003 08:47:15.304481 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:47:15 crc kubenswrapper[4810]: I1003 08:47:15.784199 4810 scope.go:117] "RemoveContainer" containerID="04f58c026417339f08ef2602daba162c0c7e4ef348e1a93ff65f6576f05b604c" Oct 03 08:47:30 crc kubenswrapper[4810]: I1003 08:47:30.302857 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:47:30 crc kubenswrapper[4810]: E1003 08:47:30.303836 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:47:43 crc kubenswrapper[4810]: I1003 08:47:43.303342 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:47:44 crc kubenswrapper[4810]: I1003 08:47:44.092212 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1"} Oct 03 08:48:15 crc kubenswrapper[4810]: I1003 08:48:15.856201 4810 scope.go:117] "RemoveContainer" containerID="d84a4fd0a7d2f0b720b3304ceea9a8c9484eb6bf73693aacfc5813244cbdd3f0" Oct 03 08:48:15 crc kubenswrapper[4810]: I1003 08:48:15.877734 4810 scope.go:117] "RemoveContainer" 
containerID="9e57577f352e5807737edf482787b45158c98b78896528f122a76e4d58670993" Oct 03 08:48:15 crc kubenswrapper[4810]: I1003 08:48:15.906799 4810 scope.go:117] "RemoveContainer" containerID="764c038f1675e1ff8b7d576ca1f31dd45f27040d4c79e530ca432a73d0950d5e" Oct 03 08:48:15 crc kubenswrapper[4810]: I1003 08:48:15.940541 4810 scope.go:117] "RemoveContainer" containerID="dad0594dc6b52a8f39d395fe55bdcf2426b48917971c61642f486cbb09c330b2" Oct 03 08:48:16 crc kubenswrapper[4810]: I1003 08:48:16.002516 4810 scope.go:117] "RemoveContainer" containerID="b37be6fcd7a63c6d3c11da956270f616d48e6352c0a7d430bb49d3f6ae84d558" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.936998 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 08:49:53 crc kubenswrapper[4810]: E1003 08:49:53.937920 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="registry-server" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.937940 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="registry-server" Oct 03 08:49:53 crc kubenswrapper[4810]: E1003 08:49:53.937964 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="extract-content" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.937972 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="extract-content" Oct 03 08:49:53 crc kubenswrapper[4810]: E1003 08:49:53.937988 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="extract-utilities" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.937995 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="extract-utilities" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.938460 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="26fae5a9-d597-453e-8636-07c2d2e73b4c" containerName="registry-server" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.939187 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.941618 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-z9qrt" Oct 03 08:49:53 crc kubenswrapper[4810]: I1003 08:49:53.948694 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.036452 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.036564 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb4qn\" (UniqueName: \"kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.137684 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.137806 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb4qn\" (UniqueName: \"kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.141212 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.141278 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fa9d6ac36ba4b686898d1d705d47adc4a45918fb0330ec4f7f1b6c91b3602114/globalmount\"" pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.160400 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb4qn\" (UniqueName: \"kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.174930 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") pod \"mariadb-copy-data\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.265863 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 08:49:54 crc kubenswrapper[4810]: I1003 08:49:54.771073 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 08:49:55 crc kubenswrapper[4810]: I1003 08:49:55.120084 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b1164f6f-490c-4ade-9824-9157b3b46be2","Type":"ContainerStarted","Data":"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68"} Oct 03 08:49:55 crc kubenswrapper[4810]: I1003 08:49:55.120259 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b1164f6f-490c-4ade-9824-9157b3b46be2","Type":"ContainerStarted","Data":"be550b9064fbc2e2ee2ab7c58ef633e383e9ad05e4dd05f9f745483b5125e271"} Oct 03 08:49:55 crc kubenswrapper[4810]: I1003 08:49:55.141274 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.14125779 podStartE2EDuration="3.14125779s" podCreationTimestamp="2025-10-03 08:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:49:55.136958096 +0000 UTC m=+6828.564208831" watchObservedRunningTime="2025-10-03 08:49:55.14125779 +0000 UTC m=+6828.568508515" Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.381292 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.383168 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.388256 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.507840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txfb\" (UniqueName: \"kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb\") pod \"mariadb-client\" (UID: \"ba8db654-4660-49a7-a4ef-26aa62ee92c6\") " pod="openstack/mariadb-client" Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.610337 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4txfb\" (UniqueName: \"kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb\") pod \"mariadb-client\" (UID: \"ba8db654-4660-49a7-a4ef-26aa62ee92c6\") " pod="openstack/mariadb-client" Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.647325 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txfb\" (UniqueName: \"kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb\") pod \"mariadb-client\" (UID: \"ba8db654-4660-49a7-a4ef-26aa62ee92c6\") " pod="openstack/mariadb-client" Oct 03 08:49:57 crc kubenswrapper[4810]: I1003 08:49:57.708017 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:49:58 crc kubenswrapper[4810]: I1003 08:49:58.122711 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:49:58 crc kubenswrapper[4810]: I1003 08:49:58.149805 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ba8db654-4660-49a7-a4ef-26aa62ee92c6","Type":"ContainerStarted","Data":"cd5530db2ac4a66f118124794092392d297d1a9cd768e33c6dcccc1aa84f2a37"} Oct 03 08:49:59 crc kubenswrapper[4810]: I1003 08:49:59.158096 4810 generic.go:334] "Generic (PLEG): container finished" podID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" containerID="af32633d3f326b2eab77b701fa7cb1357a667fd6b43274f857eefef45aaf2659" exitCode=0 Oct 03 08:49:59 crc kubenswrapper[4810]: I1003 08:49:59.158136 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ba8db654-4660-49a7-a4ef-26aa62ee92c6","Type":"ContainerDied","Data":"af32633d3f326b2eab77b701fa7cb1357a667fd6b43274f857eefef45aaf2659"} Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.458384 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.477908 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ba8db654-4660-49a7-a4ef-26aa62ee92c6/mariadb-client/0.log" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.505367 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.512374 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.555782 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txfb\" (UniqueName: \"kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb\") pod \"ba8db654-4660-49a7-a4ef-26aa62ee92c6\" (UID: \"ba8db654-4660-49a7-a4ef-26aa62ee92c6\") " Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.561473 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb" (OuterVolumeSpecName: "kube-api-access-4txfb") pod "ba8db654-4660-49a7-a4ef-26aa62ee92c6" (UID: "ba8db654-4660-49a7-a4ef-26aa62ee92c6"). InnerVolumeSpecName "kube-api-access-4txfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.637249 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:00 crc kubenswrapper[4810]: E1003 08:50:00.637741 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" containerName="mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.637776 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" containerName="mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.638069 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" containerName="mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.638879 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.651812 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.658297 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4txfb\" (UniqueName: \"kubernetes.io/projected/ba8db654-4660-49a7-a4ef-26aa62ee92c6-kube-api-access-4txfb\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.759873 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6jlc\" (UniqueName: \"kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc\") pod \"mariadb-client\" (UID: \"dda1ff37-83d9-45a0-b49f-f6d52cdecf68\") " pod="openstack/mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.861670 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6jlc\" (UniqueName: \"kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc\") pod \"mariadb-client\" (UID: \"dda1ff37-83d9-45a0-b49f-f6d52cdecf68\") " pod="openstack/mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.878629 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6jlc\" (UniqueName: \"kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc\") pod \"mariadb-client\" (UID: \"dda1ff37-83d9-45a0-b49f-f6d52cdecf68\") " pod="openstack/mariadb-client" Oct 03 08:50:00 crc kubenswrapper[4810]: I1003 08:50:00.970825 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:01 crc kubenswrapper[4810]: I1003 08:50:01.177442 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd5530db2ac4a66f118124794092392d297d1a9cd768e33c6dcccc1aa84f2a37" Oct 03 08:50:01 crc kubenswrapper[4810]: I1003 08:50:01.177524 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:01 crc kubenswrapper[4810]: I1003 08:50:01.197062 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" podUID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" Oct 03 08:50:01 crc kubenswrapper[4810]: I1003 08:50:01.312134 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8db654-4660-49a7-a4ef-26aa62ee92c6" path="/var/lib/kubelet/pods/ba8db654-4660-49a7-a4ef-26aa62ee92c6/volumes" Oct 03 08:50:01 crc kubenswrapper[4810]: I1003 08:50:01.407206 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:01 crc kubenswrapper[4810]: W1003 08:50:01.427076 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddda1ff37_83d9_45a0_b49f_f6d52cdecf68.slice/crio-32e90d041c4b17cb4dd21c468a46bd8c729ce7bc9c745979a5c302944d73f688 WatchSource:0}: Error finding container 32e90d041c4b17cb4dd21c468a46bd8c729ce7bc9c745979a5c302944d73f688: Status 404 returned error can't find the container with id 32e90d041c4b17cb4dd21c468a46bd8c729ce7bc9c745979a5c302944d73f688 Oct 03 08:50:02 crc kubenswrapper[4810]: I1003 08:50:02.088581 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:50:02 crc kubenswrapper[4810]: I1003 08:50:02.088648 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:50:02 crc kubenswrapper[4810]: I1003 08:50:02.188529 4810 generic.go:334] "Generic (PLEG): container finished" podID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" containerID="d53e4485f9d8227abdd2fd4a0ac8deb08a8686275a7607b113f97741a0991768" exitCode=0 Oct 03 08:50:02 crc kubenswrapper[4810]: I1003 08:50:02.188622 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dda1ff37-83d9-45a0-b49f-f6d52cdecf68","Type":"ContainerDied","Data":"d53e4485f9d8227abdd2fd4a0ac8deb08a8686275a7607b113f97741a0991768"} Oct 03 08:50:02 crc kubenswrapper[4810]: I1003 08:50:02.189027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dda1ff37-83d9-45a0-b49f-f6d52cdecf68","Type":"ContainerStarted","Data":"32e90d041c4b17cb4dd21c468a46bd8c729ce7bc9c745979a5c302944d73f688"} Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.471647 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.493152 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_dda1ff37-83d9-45a0-b49f-f6d52cdecf68/mariadb-client/0.log" Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.522696 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.527224 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.607134 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6jlc\" (UniqueName: \"kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc\") pod \"dda1ff37-83d9-45a0-b49f-f6d52cdecf68\" (UID: \"dda1ff37-83d9-45a0-b49f-f6d52cdecf68\") " Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.643858 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc" (OuterVolumeSpecName: "kube-api-access-h6jlc") pod "dda1ff37-83d9-45a0-b49f-f6d52cdecf68" (UID: "dda1ff37-83d9-45a0-b49f-f6d52cdecf68"). InnerVolumeSpecName "kube-api-access-h6jlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:03 crc kubenswrapper[4810]: I1003 08:50:03.708990 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6jlc\" (UniqueName: \"kubernetes.io/projected/dda1ff37-83d9-45a0-b49f-f6d52cdecf68-kube-api-access-h6jlc\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:04 crc kubenswrapper[4810]: I1003 08:50:04.204594 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e90d041c4b17cb4dd21c468a46bd8c729ce7bc9c745979a5c302944d73f688" Oct 03 08:50:04 crc kubenswrapper[4810]: I1003 08:50:04.204664 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 03 08:50:05 crc kubenswrapper[4810]: I1003 08:50:05.313852 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" path="/var/lib/kubelet/pods/dda1ff37-83d9-45a0-b49f-f6d52cdecf68/volumes" Oct 03 08:50:32 crc kubenswrapper[4810]: I1003 08:50:32.088637 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:50:32 crc kubenswrapper[4810]: I1003 08:50:32.089284 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.909070 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:36 crc kubenswrapper[4810]: E1003 08:50:36.910697 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" containerName="mariadb-client" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.910779 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" containerName="mariadb-client" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.911024 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda1ff37-83d9-45a0-b49f-f6d52cdecf68" containerName="mariadb-client" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.912190 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.923682 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.939220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.939323 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmbft\" (UniqueName: \"kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.939394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.962609 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.964245 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.977699 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.977990 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.978052 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.978162 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.978300 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-t47ks" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.978565 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.995985 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 08:50:36 crc kubenswrapper[4810]: I1003 08:50:36.997555 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.005926 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.007528 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.014802 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.038373 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040098 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040154 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040180 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040205 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040227 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040278 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040332 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040355 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040385 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmbft\" (UniqueName: \"kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040427 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040448 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040465 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbzd\" (UniqueName: \"kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040536 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040569 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040593 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040617 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.040639 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6qnz\" (UniqueName: \"kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.041170 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.041955 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.064823 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmbft\" (UniqueName: \"kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft\") pod \"redhat-operators-wlzgh\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142485 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142557 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142665 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142719 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142748 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142784 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142821 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6qnz\" (UniqueName: \"kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142844 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.142978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143037 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143074 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " 
pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143263 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143302 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143341 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143530 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb64m\" (UniqueName: \"kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143601 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbzd\" (UniqueName: \"kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 
08:50:37.143665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143682 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143729 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.143969 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.144221 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.145015 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.147062 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.148218 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.148565 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.148760 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc 
kubenswrapper[4810]: I1003 08:50:37.149126 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.155165 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.160576 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.163279 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.163316 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/298dffc4fe0d60da6a39ad90a436eb9fcbe4d976f133834a030858052d393d3b/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.165774 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbzd\" (UniqueName: \"kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.166193 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
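[editor's note: annotation, not part of the journal capture] The entries around this point show the kubelet's volume reconciler walking each new pod (ovsdbserver-nb-0/1/2, redhat-operators-wlzgh) through "VerifyControllerAttachedVolume started" and then "MountVolume.SetUp succeeded" for every declared volume, with the CSI hostpath PVCs additionally logging that MountDevice is skipped because STAGE_UNSTAGE_VOLUME is not set. When reading a capture like this by hand it is easy to miss a volume that never reaches SetUp, so the following is a minimal sketch for cross-checking the two message types against each other. It assumes only what is visible in the log format above (the klog-escaped \" quoting and the trailing pod="namespace/name" key); the file name kubelet.log is a placeholder for wherever the journal text has been saved, and a normal journal export is assumed to keep one entry per line (the wrapped lines in this capture would need rejoining first).

#!/usr/bin/env python3
# Sketch: compare "VerifyControllerAttachedVolume started" entries against
# "MountVolume.SetUp succeeded" entries in a kubelet journal capture like the
# one above, and report per pod which volumes have not completed SetUp.
# Assumption: the capture uses the klog quoting shown here (volume \"name\")
# and the structured pod="ns/name" key at the end of each matching entry.
import re
import sys
from collections import defaultdict

STARTED = re.compile(r'VerifyControllerAttachedVolume started for volume \\"([^"\\]+)\\".*?pod="([^"]+)"')
MOUNTED = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\".*?pod="([^"]+)"')

def summarize(text):
    started = defaultdict(set)   # pod -> volumes the reconciler began handling
    mounted = defaultdict(set)   # pod -> volumes whose SetUp completed
    for volume, pod in STARTED.findall(text):
        started[pod].add(volume)
    for volume, pod in MOUNTED.findall(text):
        mounted[pod].add(volume)
    for pod in sorted(started):
        missing = started[pod] - mounted[pod]
        status = "all mounted" if not missing else "waiting on: " + ", ".join(sorted(missing))
        print(f"{pod}: {len(mounted[pod])}/{len(started[pod])} volumes mounted ({status})")

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet.log"  # placeholder path
    with open(path, encoding="utf-8", errors="replace") as fh:
        summarize(fh.read())

Run against a capture like this one, it prints one line per pod showing how many of the volumes the reconciler started handling have a matching SetUp-succeeded entry, which makes a pod stuck on a missing secret, ConfigMap, or PVC stand out immediately. [end of annotation; journal capture continues]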
Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.166225 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bb28df444c67fa5b2a33cccd89af8de803d1f5b374485161806834c065317704/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.167031 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6qnz\" (UniqueName: \"kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.210913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") pod \"ovsdbserver-nb-1\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.219275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") pod \"ovsdbserver-nb-0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.235206 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246202 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246379 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb64m\" (UniqueName: \"kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246435 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246491 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246521 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.246551 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.247027 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.247613 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.248506 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.250744 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.250776 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.250983 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.251021 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b470b523cab47d4ad3dd66ebd837f90a0a3e43bb61a6917a7b778bcb42cf206a/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.251519 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.268020 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb64m\" (UniqueName: \"kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.288687 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") pod \"ovsdbserver-nb-2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.291375 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.328010 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.340430 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.705088 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.752587 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:50:37 crc kubenswrapper[4810]: I1003 08:50:37.780424 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.014326 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 08:50:38 crc kubenswrapper[4810]: W1003 08:50:38.022629 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb66def_1afa_46cc_bacf_2f6d3f8ed245.slice/crio-70739d00dabfb16e5084c88aa8c6304bb7a5a67d4f688d37017c1e9fd67bf1b4 WatchSource:0}: Error finding container 70739d00dabfb16e5084c88aa8c6304bb7a5a67d4f688d37017c1e9fd67bf1b4: Status 404 returned error can't find the container with id 70739d00dabfb16e5084c88aa8c6304bb7a5a67d4f688d37017c1e9fd67bf1b4 Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.456908 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerStarted","Data":"b77877208c57c7ee2fd86d1c19759438a454c9c5f8343d78023e9fd3b557aac8"} Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.458651 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerStarted","Data":"70739d00dabfb16e5084c88aa8c6304bb7a5a67d4f688d37017c1e9fd67bf1b4"} Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.461153 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf51c24-867c-4821-b43c-8d5733177841" containerID="0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4" exitCode=0 Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.461207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerDied","Data":"0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4"} Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.461235 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerStarted","Data":"1cb72395f2799c04dfb0dc71eab2a34ed6a94bba121fb61b402c25fcef6db8ab"} Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.595808 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.604093 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.608069 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.608478 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6l74s" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.608677 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.611843 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.613092 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.620980 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.622693 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.665130 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.671207 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.678812 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.689231 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775498 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gxws\" (UniqueName: \"kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775552 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775593 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq7bk\" (UniqueName: \"kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775736 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775811 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775928 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.775970 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.776018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.776045 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.776070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5ld\" (UniqueName: \"kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.776154 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.776217 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 
08:50:38.776702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777246 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777272 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777290 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777323 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777344 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777464 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.777533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.879836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq7bk\" (UniqueName: \"kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.879969 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880000 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880022 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880055 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880084 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880112 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880141 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") pod 
\"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880166 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5ld\" (UniqueName: \"kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880204 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880258 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880282 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880302 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880331 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880352 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880369 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880389 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880415 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880436 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880458 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880482 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880508 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gxws\" (UniqueName: \"kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.880536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.890135 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.892756 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.893023 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " 
pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.893063 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.893329 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.893390 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.894626 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.895605 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.898337 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.898627 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.899584 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.903012 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.908202 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " 
pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.908406 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.910686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.914712 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915503 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915602 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/108520b79c8428bc460c4c771a2889c3e654a6e154b861040adea522a3c0a2d0/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915524 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915748 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915868 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0ed32caecd8ef6431a28077495ef118cb8d704ab7cf69a9b645a6678a0e9534a/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.915757 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.916084 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6d2cbba667654c653456084b3b1b5d5c39bebc177f0ecd9bc91ee055e83a83da/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.920604 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.930407 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq7bk\" (UniqueName: \"kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.944305 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5ld\" (UniqueName: \"kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:38 crc kubenswrapper[4810]: I1003 08:50:38.961806 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gxws\" (UniqueName: \"kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.134459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") pod \"ovsdbserver-sb-1\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.136069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") pod \"ovsdbserver-sb-0\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.142019 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") pod \"ovsdbserver-sb-2\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.164770 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.243958 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.261221 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.296460 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.475920 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerStarted","Data":"5acfcf4c9e7728e5a2cd0c1df052b3599672e216953a30120cfe60feb2075668"} Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.816110 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:50:39 crc kubenswrapper[4810]: I1003 08:50:39.912350 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 08:50:39 crc kubenswrapper[4810]: W1003 08:50:39.919392 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffece18b_f8fb_403a_8500_e00e53d9856e.slice/crio-430dfc21522fc77148d10bdee3fb431fac9e74408515e2ce3900f2942cfa23d4 WatchSource:0}: Error finding container 430dfc21522fc77148d10bdee3fb431fac9e74408515e2ce3900f2942cfa23d4: Status 404 returned error can't find the container with id 430dfc21522fc77148d10bdee3fb431fac9e74408515e2ce3900f2942cfa23d4 Oct 03 08:50:40 crc kubenswrapper[4810]: I1003 08:50:40.487258 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerStarted","Data":"3339f686fca5904af2854a31580acf6e1f8d8467c59651449bb5f64aed5dc2ea"} Oct 03 08:50:40 crc kubenswrapper[4810]: I1003 08:50:40.488789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerStarted","Data":"430dfc21522fc77148d10bdee3fb431fac9e74408515e2ce3900f2942cfa23d4"} Oct 03 08:50:40 crc kubenswrapper[4810]: I1003 08:50:40.491541 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf51c24-867c-4821-b43c-8d5733177841" containerID="e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459" exitCode=0 Oct 03 08:50:40 crc kubenswrapper[4810]: I1003 08:50:40.491571 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerDied","Data":"e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459"} Oct 03 08:50:40 crc kubenswrapper[4810]: I1003 08:50:40.684469 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.508820 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerStarted","Data":"448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.511496 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerStarted","Data":"a3d3c9e0db1422a18299e83c0a317086eace0fbe45d5f64b129666d18947412f"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.511523 
4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerStarted","Data":"48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.514350 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerStarted","Data":"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.516341 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerStarted","Data":"5dae80ce7a83165b8d3275fddb9ca028f74cb3e827dba425ed4db81bb9412881"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.518604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerStarted","Data":"7bb028b7a5bb063820d9ba2accea2471b634b11658dc72ef39f28643120fa6d7"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.518639 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerStarted","Data":"6c91709fb1869d8c7dcdbb6f88a006eabccffee810f357690eff8446064838c1"} Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.535813 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.972059392 podStartE2EDuration="7.535798853s" podCreationTimestamp="2025-10-03 08:50:35 +0000 UTC" firstStartedPulling="2025-10-03 08:50:39.172466989 +0000 UTC m=+6872.599717724" lastFinishedPulling="2025-10-03 08:50:41.73620645 +0000 UTC m=+6875.163457185" observedRunningTime="2025-10-03 08:50:42.534607741 +0000 UTC m=+6875.961858496" watchObservedRunningTime="2025-10-03 08:50:42.535798853 +0000 UTC m=+6875.963049588" Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.576780 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wlzgh" podStartSLOduration=3.082019943 podStartE2EDuration="6.576765463s" podCreationTimestamp="2025-10-03 08:50:36 +0000 UTC" firstStartedPulling="2025-10-03 08:50:38.463552171 +0000 UTC m=+6871.890802926" lastFinishedPulling="2025-10-03 08:50:41.958297711 +0000 UTC m=+6875.385548446" observedRunningTime="2025-10-03 08:50:42.576720422 +0000 UTC m=+6876.003971157" watchObservedRunningTime="2025-10-03 08:50:42.576765463 +0000 UTC m=+6876.004016198" Oct 03 08:50:42 crc kubenswrapper[4810]: I1003 08:50:42.580115 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.6376337960000003 podStartE2EDuration="7.580105492s" podCreationTimestamp="2025-10-03 08:50:35 +0000 UTC" firstStartedPulling="2025-10-03 08:50:37.780214733 +0000 UTC m=+6871.207465468" lastFinishedPulling="2025-10-03 08:50:41.722686419 +0000 UTC m=+6875.149937164" observedRunningTime="2025-10-03 08:50:42.553104843 +0000 UTC m=+6875.980355578" watchObservedRunningTime="2025-10-03 08:50:42.580105492 +0000 UTC m=+6876.007356217" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.292130 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 
08:50:43.340855 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.533664 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerStarted","Data":"0e74de867618f048890730ae1b8ae54e811efdf031ec6aee30450263081817ad"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.533713 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerStarted","Data":"ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.536523 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerStarted","Data":"9f40d85da6ecfa7849d414ba4e3f550a824b8bba334f771ee2af8a049f0905e5"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.536580 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerStarted","Data":"c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.539470 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerStarted","Data":"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.539539 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerStarted","Data":"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.543707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerStarted","Data":"9e8625267188d3164afef69fbc3a4e9502bd91a19d31086839f399a5a43e1daa"} Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.557305 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.874157894 podStartE2EDuration="6.557284172s" podCreationTimestamp="2025-10-03 08:50:37 +0000 UTC" firstStartedPulling="2025-10-03 08:50:39.852838139 +0000 UTC m=+6873.280088874" lastFinishedPulling="2025-10-03 08:50:42.535964417 +0000 UTC m=+6875.963215152" observedRunningTime="2025-10-03 08:50:43.549022532 +0000 UTC m=+6876.976273277" watchObservedRunningTime="2025-10-03 08:50:43.557284172 +0000 UTC m=+6876.984534907" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.576036 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.871794163 podStartE2EDuration="8.57601777s" podCreationTimestamp="2025-10-03 08:50:35 +0000 UTC" firstStartedPulling="2025-10-03 08:50:38.02623177 +0000 UTC m=+6871.453482495" lastFinishedPulling="2025-10-03 08:50:41.730455367 +0000 UTC m=+6875.157706102" observedRunningTime="2025-10-03 08:50:43.571619533 +0000 UTC m=+6876.998870278" watchObservedRunningTime="2025-10-03 08:50:43.57601777 +0000 UTC m=+6877.003268505" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.593809 4810 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.827724248 podStartE2EDuration="6.593791534s" podCreationTimestamp="2025-10-03 08:50:37 +0000 UTC" firstStartedPulling="2025-10-03 08:50:39.921656311 +0000 UTC m=+6873.348907046" lastFinishedPulling="2025-10-03 08:50:42.687723597 +0000 UTC m=+6876.114974332" observedRunningTime="2025-10-03 08:50:43.593022813 +0000 UTC m=+6877.020273568" watchObservedRunningTime="2025-10-03 08:50:43.593791534 +0000 UTC m=+6877.021042269" Oct 03 08:50:43 crc kubenswrapper[4810]: I1003 08:50:43.614086 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=5.783655289 podStartE2EDuration="6.614067193s" podCreationTimestamp="2025-10-03 08:50:37 +0000 UTC" firstStartedPulling="2025-10-03 08:50:41.689529727 +0000 UTC m=+6875.116780462" lastFinishedPulling="2025-10-03 08:50:42.519941621 +0000 UTC m=+6875.947192366" observedRunningTime="2025-10-03 08:50:43.609475521 +0000 UTC m=+6877.036726276" watchObservedRunningTime="2025-10-03 08:50:43.614067193 +0000 UTC m=+6877.041317918" Oct 03 08:50:44 crc kubenswrapper[4810]: I1003 08:50:44.244963 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:44 crc kubenswrapper[4810]: I1003 08:50:44.261821 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:44 crc kubenswrapper[4810]: I1003 08:50:44.297020 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:45 crc kubenswrapper[4810]: I1003 08:50:45.244613 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:45 crc kubenswrapper[4810]: I1003 08:50:45.262233 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:45 crc kubenswrapper[4810]: I1003 08:50:45.288170 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:45 crc kubenswrapper[4810]: I1003 08:50:45.297388 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:45 crc kubenswrapper[4810]: I1003 08:50:45.329635 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.329422 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.333390 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.333868 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.368867 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.379136 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.381685 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:46 crc 
kubenswrapper[4810]: I1003 08:50:46.382228 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.427345 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.554342 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.556028 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.558802 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.561039 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.565078 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.607494 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.649161 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.649227 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncsrm\" (UniqueName: \"kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.649265 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.649482 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.750862 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.750965 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncsrm\" (UniqueName: 
\"kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.751074 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.751204 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.751927 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.751971 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.752221 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.770571 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncsrm\" (UniqueName: \"kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm\") pod \"dnsmasq-dns-7b6c679f77-5fmgg\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:46 crc kubenswrapper[4810]: I1003 08:50:46.893392 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.236113 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.236582 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.300521 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.388496 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:47 crc kubenswrapper[4810]: W1003 08:50:47.399426 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84768ee9_029f_4b7a_8f65_2fc0e5c3fb3e.slice/crio-7583c5bc7b57e7b9175511ead5d7d20d02401b7eeeb08e5be9b18684a7736f97 WatchSource:0}: Error finding container 7583c5bc7b57e7b9175511ead5d7d20d02401b7eeeb08e5be9b18684a7736f97: Status 404 returned error can't find the container with id 7583c5bc7b57e7b9175511ead5d7d20d02401b7eeeb08e5be9b18684a7736f97 Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.572720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" event={"ID":"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e","Type":"ContainerStarted","Data":"7583c5bc7b57e7b9175511ead5d7d20d02401b7eeeb08e5be9b18684a7736f97"} Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.615659 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:47 crc kubenswrapper[4810]: I1003 08:50:47.664870 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.334211 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.376926 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.584379 4810 generic.go:334] "Generic (PLEG): container finished" podID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerID="3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234" exitCode=0 Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.584987 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" event={"ID":"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e","Type":"ContainerDied","Data":"3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234"} Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.672359 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.713045 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.714360 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.719239 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.730449 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.791588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gg6g\" (UniqueName: \"kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.791665 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.791823 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.791859 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.792070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.893340 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.893425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.893473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " 
pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.893564 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.893664 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gg6g\" (UniqueName: \"kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.894387 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.894504 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.894525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.894842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:48 crc kubenswrapper[4810]: I1003 08:50:48.910699 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gg6g\" (UniqueName: \"kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g\") pod \"dnsmasq-dns-868956d5bc-hr8mz\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.066058 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.314424 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.315013 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.507375 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.595677 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" event={"ID":"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e","Type":"ContainerStarted","Data":"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299"} Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.596634 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.596654 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="dnsmasq-dns" containerID="cri-o://5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299" gracePeriod=10 Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.597016 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" event={"ID":"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1","Type":"ContainerStarted","Data":"7313e625c8ee0443d0d7e91bdef29227b7859e69a8a565dd4f34017717cd45e2"} Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.597149 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlzgh" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="registry-server" containerID="cri-o://2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05" gracePeriod=2 Oct 03 08:50:49 crc kubenswrapper[4810]: I1003 08:50:49.624704 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" podStartSLOduration=3.6246812459999997 podStartE2EDuration="3.624681246s" podCreationTimestamp="2025-10-03 08:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:50:49.617522266 +0000 UTC m=+6883.044773001" watchObservedRunningTime="2025-10-03 08:50:49.624681246 +0000 UTC m=+6883.051931981" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.100611 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.110069 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217536 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmbft\" (UniqueName: \"kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft\") pod \"dbf51c24-867c-4821-b43c-8d5733177841\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217623 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content\") pod \"dbf51c24-867c-4821-b43c-8d5733177841\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217683 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config\") pod \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217796 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc\") pod \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217834 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncsrm\" (UniqueName: \"kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm\") pod \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217863 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb\") pod \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\" (UID: \"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.217941 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities\") pod \"dbf51c24-867c-4821-b43c-8d5733177841\" (UID: \"dbf51c24-867c-4821-b43c-8d5733177841\") " Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.219499 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities" (OuterVolumeSpecName: "utilities") pod "dbf51c24-867c-4821-b43c-8d5733177841" (UID: "dbf51c24-867c-4821-b43c-8d5733177841"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.223984 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft" (OuterVolumeSpecName: "kube-api-access-gmbft") pod "dbf51c24-867c-4821-b43c-8d5733177841" (UID: "dbf51c24-867c-4821-b43c-8d5733177841"). InnerVolumeSpecName "kube-api-access-gmbft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.224131 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm" (OuterVolumeSpecName: "kube-api-access-ncsrm") pod "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" (UID: "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e"). InnerVolumeSpecName "kube-api-access-ncsrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.261664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" (UID: "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.265108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config" (OuterVolumeSpecName: "config") pod "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" (UID: "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.265782 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" (UID: "84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319548 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319582 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncsrm\" (UniqueName: \"kubernetes.io/projected/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-kube-api-access-ncsrm\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319593 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319604 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319634 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmbft\" (UniqueName: \"kubernetes.io/projected/dbf51c24-867c-4821-b43c-8d5733177841-kube-api-access-gmbft\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.319643 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.606761 4810 generic.go:334] "Generic (PLEG): container finished" 
podID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerID="5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299" exitCode=0 Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.606848 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.606840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" event={"ID":"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e","Type":"ContainerDied","Data":"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299"} Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.606941 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6c679f77-5fmgg" event={"ID":"84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e","Type":"ContainerDied","Data":"7583c5bc7b57e7b9175511ead5d7d20d02401b7eeeb08e5be9b18684a7736f97"} Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.606968 4810 scope.go:117] "RemoveContainer" containerID="5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.612325 4810 generic.go:334] "Generic (PLEG): container finished" podID="dbf51c24-867c-4821-b43c-8d5733177841" containerID="2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05" exitCode=0 Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.612523 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerDied","Data":"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05"} Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.612665 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlzgh" event={"ID":"dbf51c24-867c-4821-b43c-8d5733177841","Type":"ContainerDied","Data":"1cb72395f2799c04dfb0dc71eab2a34ed6a94bba121fb61b402c25fcef6db8ab"} Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.612880 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wlzgh" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.616092 4810 generic.go:334] "Generic (PLEG): container finished" podID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerID="4b88520c3b3e0c7afd2c46fd692f07f64a2ae7a44511b94f59d104bbd98fc346" exitCode=0 Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.616120 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" event={"ID":"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1","Type":"ContainerDied","Data":"4b88520c3b3e0c7afd2c46fd692f07f64a2ae7a44511b94f59d104bbd98fc346"} Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.642146 4810 scope.go:117] "RemoveContainer" containerID="3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.659313 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.666158 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b6c679f77-5fmgg"] Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.671412 4810 scope.go:117] "RemoveContainer" containerID="5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299" Oct 03 08:50:50 crc kubenswrapper[4810]: E1003 08:50:50.671680 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299\": container with ID starting with 5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299 not found: ID does not exist" containerID="5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.671713 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299"} err="failed to get container status \"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299\": rpc error: code = NotFound desc = could not find container \"5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299\": container with ID starting with 5329a4cce772f9595b25e43d9a74a9f905c0631925ff20cd1ce156415cccd299 not found: ID does not exist" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.671743 4810 scope.go:117] "RemoveContainer" containerID="3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234" Oct 03 08:50:50 crc kubenswrapper[4810]: E1003 08:50:50.672012 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234\": container with ID starting with 3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234 not found: ID does not exist" containerID="3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.672051 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234"} err="failed to get container status \"3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234\": rpc error: code = NotFound desc = could not find container \"3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234\": container with ID starting with 
3bfe2bc84e87230829d40b7118d43b3a57c6369648b030c797ec914fc1177234 not found: ID does not exist" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.672081 4810 scope.go:117] "RemoveContainer" containerID="2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.705636 4810 scope.go:117] "RemoveContainer" containerID="e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.736194 4810 scope.go:117] "RemoveContainer" containerID="0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.762606 4810 scope.go:117] "RemoveContainer" containerID="2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05" Oct 03 08:50:50 crc kubenswrapper[4810]: E1003 08:50:50.763099 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05\": container with ID starting with 2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05 not found: ID does not exist" containerID="2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.763128 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05"} err="failed to get container status \"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05\": rpc error: code = NotFound desc = could not find container \"2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05\": container with ID starting with 2d9f297184d388992bf36302415c630e6ba6cde6ba16cd4a3d07d6a20a394f05 not found: ID does not exist" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.763149 4810 scope.go:117] "RemoveContainer" containerID="e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459" Oct 03 08:50:50 crc kubenswrapper[4810]: E1003 08:50:50.763747 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459\": container with ID starting with e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459 not found: ID does not exist" containerID="e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.763867 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459"} err="failed to get container status \"e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459\": rpc error: code = NotFound desc = could not find container \"e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459\": container with ID starting with e0e46f4d8a40ab75bde38abce55cc43a8842aab1fa9ea060ddd07446ff458459 not found: ID does not exist" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.764011 4810 scope.go:117] "RemoveContainer" containerID="0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4" Oct 03 08:50:50 crc kubenswrapper[4810]: E1003 08:50:50.764434 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4\": container 
with ID starting with 0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4 not found: ID does not exist" containerID="0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4" Oct 03 08:50:50 crc kubenswrapper[4810]: I1003 08:50:50.764458 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4"} err="failed to get container status \"0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4\": rpc error: code = NotFound desc = could not find container \"0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4\": container with ID starting with 0d25b7bebec509a0aedd6dab430b0f8d0dd3c0163607731067c52c70215e7de4 not found: ID does not exist" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.314038 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" path="/var/lib/kubelet/pods/84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e/volumes" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.555516 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbf51c24-867c-4821-b43c-8d5733177841" (UID: "dbf51c24-867c-4821-b43c-8d5733177841"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.631390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" event={"ID":"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1","Type":"ContainerStarted","Data":"b33f071bdda3c521a02cc348fe4ec31841c70429a49d1c240f3effae63cc45e7"} Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.631642 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.639501 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf51c24-867c-4821-b43c-8d5733177841-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.655449 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" podStartSLOduration=3.655427757 podStartE2EDuration="3.655427757s" podCreationTimestamp="2025-10-03 08:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:50:51.648356479 +0000 UTC m=+6885.075607244" watchObservedRunningTime="2025-10-03 08:50:51.655427757 +0000 UTC m=+6885.082678492" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.845545 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.852548 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlzgh"] Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.919954 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 03 08:50:51 crc kubenswrapper[4810]: E1003 08:50:51.920459 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="extract-utilities" Oct 03 08:50:51 crc kubenswrapper[4810]: 
I1003 08:50:51.920556 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="extract-utilities" Oct 03 08:50:51 crc kubenswrapper[4810]: E1003 08:50:51.920626 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="registry-server" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.920688 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="registry-server" Oct 03 08:50:51 crc kubenswrapper[4810]: E1003 08:50:51.920750 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="extract-content" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.920818 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="extract-content" Oct 03 08:50:51 crc kubenswrapper[4810]: E1003 08:50:51.920911 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="init" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.920967 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="init" Oct 03 08:50:51 crc kubenswrapper[4810]: E1003 08:50:51.921043 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="dnsmasq-dns" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.921098 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="dnsmasq-dns" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.921301 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf51c24-867c-4821-b43c-8d5733177841" containerName="registry-server" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.921369 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="84768ee9-029f-4b7a-8f65-2fc0e5c3fb3e" containerName="dnsmasq-dns" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.922011 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.925733 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 03 08:50:51 crc kubenswrapper[4810]: I1003 08:50:51.930113 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.044590 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4xjv\" (UniqueName: \"kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.044671 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.044710 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.146467 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4xjv\" (UniqueName: \"kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.146636 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.146686 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.152765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.154631 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.154678 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ad9fb3074f4df67dcdbc4efe9a1f1627dbb431103630a784e0d0f3737030b47/globalmount\"" pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.172071 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4xjv\" (UniqueName: \"kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.202744 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") pod \"ovn-copy-data\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.237701 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 08:50:52 crc kubenswrapper[4810]: I1003 08:50:52.760722 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 08:50:52 crc kubenswrapper[4810]: W1003 08:50:52.763633 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ad2114_1f17_4dee_ba01_428f6f8a33d8.slice/crio-3a070ab9963c69ee57b08d9c2a160316baf19602b47c5022bf3be179f1c0db8c WatchSource:0}: Error finding container 3a070ab9963c69ee57b08d9c2a160316baf19602b47c5022bf3be179f1c0db8c: Status 404 returned error can't find the container with id 3a070ab9963c69ee57b08d9c2a160316baf19602b47c5022bf3be179f1c0db8c Oct 03 08:50:53 crc kubenswrapper[4810]: I1003 08:50:53.311743 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf51c24-867c-4821-b43c-8d5733177841" path="/var/lib/kubelet/pods/dbf51c24-867c-4821-b43c-8d5733177841/volumes" Oct 03 08:50:53 crc kubenswrapper[4810]: I1003 08:50:53.669072 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"c2ad2114-1f17-4dee-ba01-428f6f8a33d8","Type":"ContainerStarted","Data":"44a3da3283de87f0f425a172444a706cfee46d2186e7801031398a2bcfb7e8dc"} Oct 03 08:50:53 crc kubenswrapper[4810]: I1003 08:50:53.669486 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"c2ad2114-1f17-4dee-ba01-428f6f8a33d8","Type":"ContainerStarted","Data":"3a070ab9963c69ee57b08d9c2a160316baf19602b47c5022bf3be179f1c0db8c"} Oct 03 08:50:53 crc kubenswrapper[4810]: I1003 08:50:53.716108 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.121959111 podStartE2EDuration="3.716082655s" podCreationTimestamp="2025-10-03 08:50:50 +0000 UTC" firstStartedPulling="2025-10-03 08:50:52.765625877 +0000 UTC m=+6886.192876652" lastFinishedPulling="2025-10-03 08:50:53.359749451 +0000 UTC m=+6886.787000196" observedRunningTime="2025-10-03 08:50:53.712260473 +0000 UTC 
m=+6887.139511208" watchObservedRunningTime="2025-10-03 08:50:53.716082655 +0000 UTC m=+6887.143333390" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.068349 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.120152 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.120467 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="dnsmasq-dns" containerID="cri-o://2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4" gracePeriod=10 Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.678460 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.727381 4810 generic.go:334] "Generic (PLEG): container finished" podID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerID="2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4" exitCode=0 Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.727420 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" event={"ID":"9a8db38e-fd2f-478f-97c7-52e6fb134f80","Type":"ContainerDied","Data":"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4"} Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.727451 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" event={"ID":"9a8db38e-fd2f-478f-97c7-52e6fb134f80","Type":"ContainerDied","Data":"100ab376fbe543d7d122170c95566aba8db2f7537820fb203ad9c1d48d4891ed"} Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.727470 4810 scope.go:117] "RemoveContainer" containerID="2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.727584 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96d5866c7-z65nq" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.753915 4810 scope.go:117] "RemoveContainer" containerID="e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.776531 4810 scope.go:117] "RemoveContainer" containerID="2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4" Oct 03 08:50:59 crc kubenswrapper[4810]: E1003 08:50:59.785093 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4\": container with ID starting with 2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4 not found: ID does not exist" containerID="2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.785337 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4"} err="failed to get container status \"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4\": rpc error: code = NotFound desc = could not find container \"2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4\": container with ID starting with 2c232177f883d1b37fbbef5bb6c65b32065885d2174a4284cffc2bb352838ad4 not found: ID does not exist" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.785497 4810 scope.go:117] "RemoveContainer" containerID="e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473" Oct 03 08:50:59 crc kubenswrapper[4810]: E1003 08:50:59.786712 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473\": container with ID starting with e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473 not found: ID does not exist" containerID="e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.786754 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473"} err="failed to get container status \"e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473\": rpc error: code = NotFound desc = could not find container \"e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473\": container with ID starting with e463968c1ecf2c30b1451725a4437eb27b7942758a8e724c2986049cc0e88473 not found: ID does not exist" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.822088 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvr9d\" (UniqueName: \"kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d\") pod \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.822214 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config\") pod \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.822276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc\") pod \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\" (UID: \"9a8db38e-fd2f-478f-97c7-52e6fb134f80\") " Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.827681 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d" (OuterVolumeSpecName: "kube-api-access-nvr9d") pod "9a8db38e-fd2f-478f-97c7-52e6fb134f80" (UID: "9a8db38e-fd2f-478f-97c7-52e6fb134f80"). InnerVolumeSpecName "kube-api-access-nvr9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.864364 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config" (OuterVolumeSpecName: "config") pod "9a8db38e-fd2f-478f-97c7-52e6fb134f80" (UID: "9a8db38e-fd2f-478f-97c7-52e6fb134f80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.865008 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a8db38e-fd2f-478f-97c7-52e6fb134f80" (UID: "9a8db38e-fd2f-478f-97c7-52e6fb134f80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.924727 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvr9d\" (UniqueName: \"kubernetes.io/projected/9a8db38e-fd2f-478f-97c7-52e6fb134f80-kube-api-access-nvr9d\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.924809 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:50:59 crc kubenswrapper[4810]: I1003 08:50:59.924820 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a8db38e-fd2f-478f-97c7-52e6fb134f80-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:00 crc kubenswrapper[4810]: I1003 08:51:00.061280 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:51:00 crc kubenswrapper[4810]: I1003 08:51:00.069347 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-96d5866c7-z65nq"] Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.321029 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" path="/var/lib/kubelet/pods/9a8db38e-fd2f-478f-97c7-52e6fb134f80/volumes" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.393144 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:51:01 crc kubenswrapper[4810]: E1003 08:51:01.396508 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="dnsmasq-dns" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.396550 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="dnsmasq-dns" Oct 03 08:51:01 crc kubenswrapper[4810]: E1003 08:51:01.396572 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="init" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.396580 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="init" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.396780 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a8db38e-fd2f-478f-97c7-52e6fb134f80" containerName="dnsmasq-dns" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.400392 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.404550 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.404989 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.405165 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.411569 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lrbnr" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.453092 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.565626 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.565693 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.565763 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.565927 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgn2w\" (UniqueName: \"kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.566006 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.566047 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.566080 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667440 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667760 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667817 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667865 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgn2w\" (UniqueName: \"kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667915 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667945 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.667971 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.668250 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.669096 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.669230 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.676134 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.676134 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.681511 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.691151 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgn2w\" (UniqueName: \"kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w\") pod \"ovn-northd-0\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " pod="openstack/ovn-northd-0" Oct 03 08:51:01 crc kubenswrapper[4810]: I1003 08:51:01.753481 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.089145 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.089603 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.089649 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.090523 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.090586 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1" gracePeriod=600 Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.233732 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:51:02 crc kubenswrapper[4810]: W1003 08:51:02.252707 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb2735e6_11ae_45cc_94fe_9628468a810a.slice/crio-845ec6c9618d658b236e8d9404df56719ec237542322fc2b2c727c24ae48340d WatchSource:0}: Error finding container 845ec6c9618d658b236e8d9404df56719ec237542322fc2b2c727c24ae48340d: Status 404 returned error can't find the container with id 845ec6c9618d658b236e8d9404df56719ec237542322fc2b2c727c24ae48340d Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.760554 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerStarted","Data":"845ec6c9618d658b236e8d9404df56719ec237542322fc2b2c727c24ae48340d"} Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.767955 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1" exitCode=0 Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.768051 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1"} Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.768140 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99"} Oct 03 08:51:02 crc kubenswrapper[4810]: I1003 08:51:02.768164 4810 scope.go:117] "RemoveContainer" containerID="3aaba18b81ddd96a1deb40971118e6b4354fc47ddc2fd845858b078806f48d84" Oct 03 08:51:03 crc kubenswrapper[4810]: I1003 08:51:03.778428 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerStarted","Data":"1de9cdf0c83a3fe94c13ec3cd20f602220b80934411f5a0d8ab59ac3fce98075"} Oct 03 08:51:03 crc kubenswrapper[4810]: I1003 08:51:03.779430 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 08:51:03 crc kubenswrapper[4810]: I1003 08:51:03.779455 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerStarted","Data":"68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d"} Oct 03 08:51:03 crc kubenswrapper[4810]: I1003 08:51:03.806646 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.9249592340000001 podStartE2EDuration="2.806618382s" podCreationTimestamp="2025-10-03 08:51:01 +0000 UTC" firstStartedPulling="2025-10-03 08:51:02.255731772 +0000 UTC m=+6895.682982507" lastFinishedPulling="2025-10-03 08:51:03.13739092 +0000 UTC m=+6896.564641655" observedRunningTime="2025-10-03 08:51:03.80242085 +0000 UTC m=+6897.229671585" watchObservedRunningTime="2025-10-03 08:51:03.806618382 +0000 UTC m=+6897.233869117" Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.196584 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lmhgp"] Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.198832 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.213807 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmhgp"] Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.347974 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfp7\" (UniqueName: \"kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7\") pod \"keystone-db-create-lmhgp\" (UID: \"66eb28ac-c56f-4700-9faa-85e3327c303b\") " pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.451784 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfp7\" (UniqueName: \"kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7\") pod \"keystone-db-create-lmhgp\" (UID: \"66eb28ac-c56f-4700-9faa-85e3327c303b\") " pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.479023 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfp7\" (UniqueName: \"kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7\") pod \"keystone-db-create-lmhgp\" (UID: \"66eb28ac-c56f-4700-9faa-85e3327c303b\") " pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:09 crc kubenswrapper[4810]: I1003 08:51:09.538746 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:10 crc kubenswrapper[4810]: W1003 08:51:10.095456 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66eb28ac_c56f_4700_9faa_85e3327c303b.slice/crio-c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48 WatchSource:0}: Error finding container c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48: Status 404 returned error can't find the container with id c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48 Oct 03 08:51:10 crc kubenswrapper[4810]: I1003 08:51:10.099505 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lmhgp"] Oct 03 08:51:10 crc kubenswrapper[4810]: I1003 08:51:10.855038 4810 generic.go:334] "Generic (PLEG): container finished" podID="66eb28ac-c56f-4700-9faa-85e3327c303b" containerID="16eb23acd136f97ac0bb85b390a4ff4943be9ff9d066f146c47f808111970b89" exitCode=0 Oct 03 08:51:10 crc kubenswrapper[4810]: I1003 08:51:10.855136 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmhgp" event={"ID":"66eb28ac-c56f-4700-9faa-85e3327c303b","Type":"ContainerDied","Data":"16eb23acd136f97ac0bb85b390a4ff4943be9ff9d066f146c47f808111970b89"} Oct 03 08:51:10 crc kubenswrapper[4810]: I1003 08:51:10.855604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmhgp" event={"ID":"66eb28ac-c56f-4700-9faa-85e3327c303b","Type":"ContainerStarted","Data":"c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48"} Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.223680 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.310153 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dfp7\" (UniqueName: \"kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7\") pod \"66eb28ac-c56f-4700-9faa-85e3327c303b\" (UID: \"66eb28ac-c56f-4700-9faa-85e3327c303b\") " Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.320055 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7" (OuterVolumeSpecName: "kube-api-access-5dfp7") pod "66eb28ac-c56f-4700-9faa-85e3327c303b" (UID: "66eb28ac-c56f-4700-9faa-85e3327c303b"). InnerVolumeSpecName "kube-api-access-5dfp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.413251 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dfp7\" (UniqueName: \"kubernetes.io/projected/66eb28ac-c56f-4700-9faa-85e3327c303b-kube-api-access-5dfp7\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.876956 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lmhgp" event={"ID":"66eb28ac-c56f-4700-9faa-85e3327c303b","Type":"ContainerDied","Data":"c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48"} Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.877010 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c25bd713f0755fe6744172a4e82186a9e6d92b0b3b1619249865ddb7cb449e48" Oct 03 08:51:12 crc kubenswrapper[4810]: I1003 08:51:12.877058 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lmhgp" Oct 03 08:51:13 crc kubenswrapper[4810]: E1003 08:51:13.045411 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66eb28ac_c56f_4700_9faa_85e3327c303b.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:51:16 crc kubenswrapper[4810]: I1003 08:51:16.830750 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.194200 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b067-account-create-fcrcx"] Oct 03 08:51:19 crc kubenswrapper[4810]: E1003 08:51:19.195342 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66eb28ac-c56f-4700-9faa-85e3327c303b" containerName="mariadb-database-create" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.195365 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="66eb28ac-c56f-4700-9faa-85e3327c303b" containerName="mariadb-database-create" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.195650 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="66eb28ac-c56f-4700-9faa-85e3327c303b" containerName="mariadb-database-create" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.196742 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.200650 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b067-account-create-fcrcx"] Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.200666 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.357683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jvg8\" (UniqueName: \"kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8\") pod \"keystone-b067-account-create-fcrcx\" (UID: \"7d008017-c649-4878-b691-07cbc101901f\") " pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.459952 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jvg8\" (UniqueName: \"kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8\") pod \"keystone-b067-account-create-fcrcx\" (UID: \"7d008017-c649-4878-b691-07cbc101901f\") " pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.483758 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jvg8\" (UniqueName: \"kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8\") pod \"keystone-b067-account-create-fcrcx\" (UID: \"7d008017-c649-4878-b691-07cbc101901f\") " pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:19 crc kubenswrapper[4810]: I1003 08:51:19.525351 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:20 crc kubenswrapper[4810]: I1003 08:51:20.029261 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b067-account-create-fcrcx"] Oct 03 08:51:20 crc kubenswrapper[4810]: I1003 08:51:20.958540 4810 generic.go:334] "Generic (PLEG): container finished" podID="7d008017-c649-4878-b691-07cbc101901f" containerID="6ad3d0a696c4ff3e86eadaee3703451b858cce5adeab38849fbc64554d7ae74f" exitCode=0 Oct 03 08:51:20 crc kubenswrapper[4810]: I1003 08:51:20.958661 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b067-account-create-fcrcx" event={"ID":"7d008017-c649-4878-b691-07cbc101901f","Type":"ContainerDied","Data":"6ad3d0a696c4ff3e86eadaee3703451b858cce5adeab38849fbc64554d7ae74f"} Oct 03 08:51:20 crc kubenswrapper[4810]: I1003 08:51:20.959230 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b067-account-create-fcrcx" event={"ID":"7d008017-c649-4878-b691-07cbc101901f","Type":"ContainerStarted","Data":"823740bdbf5b9d6d22c3a431aedd7925d499054a2a515f588def4c45e1cc01d3"} Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.319680 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.511199 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jvg8\" (UniqueName: \"kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8\") pod \"7d008017-c649-4878-b691-07cbc101901f\" (UID: \"7d008017-c649-4878-b691-07cbc101901f\") " Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.518862 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8" (OuterVolumeSpecName: "kube-api-access-9jvg8") pod "7d008017-c649-4878-b691-07cbc101901f" (UID: "7d008017-c649-4878-b691-07cbc101901f"). InnerVolumeSpecName "kube-api-access-9jvg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.613358 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jvg8\" (UniqueName: \"kubernetes.io/projected/7d008017-c649-4878-b691-07cbc101901f-kube-api-access-9jvg8\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.994966 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b067-account-create-fcrcx" event={"ID":"7d008017-c649-4878-b691-07cbc101901f","Type":"ContainerDied","Data":"823740bdbf5b9d6d22c3a431aedd7925d499054a2a515f588def4c45e1cc01d3"} Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.995018 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="823740bdbf5b9d6d22c3a431aedd7925d499054a2a515f588def4c45e1cc01d3" Oct 03 08:51:22 crc kubenswrapper[4810]: I1003 08:51:22.995020 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b067-account-create-fcrcx" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.751965 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-shp2j"] Oct 03 08:51:24 crc kubenswrapper[4810]: E1003 08:51:24.752692 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d008017-c649-4878-b691-07cbc101901f" containerName="mariadb-account-create" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.752711 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d008017-c649-4878-b691-07cbc101901f" containerName="mariadb-account-create" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.752995 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d008017-c649-4878-b691-07cbc101901f" containerName="mariadb-account-create" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.753636 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.755695 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.755811 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.756086 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zs8sq" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.756153 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.770896 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-shp2j"] Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.953539 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wdb\" (UniqueName: \"kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.953665 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:24 crc kubenswrapper[4810]: I1003 08:51:24.953756 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.055587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wdb\" (UniqueName: \"kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.056220 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.056353 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.061990 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " 
pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.066025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.074203 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wdb\" (UniqueName: \"kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb\") pod \"keystone-db-sync-shp2j\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.080452 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:25 crc kubenswrapper[4810]: I1003 08:51:25.535192 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-shp2j"] Oct 03 08:51:26 crc kubenswrapper[4810]: I1003 08:51:26.021043 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-shp2j" event={"ID":"9235a01f-b369-484b-bc48-fa523d863294","Type":"ContainerStarted","Data":"481608dcdd9b01f215d680f4451357ba5c536ea2eaf46baf352e9de0b191b913"} Oct 03 08:51:31 crc kubenswrapper[4810]: I1003 08:51:31.073120 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-shp2j" event={"ID":"9235a01f-b369-484b-bc48-fa523d863294","Type":"ContainerStarted","Data":"61f214431d6eb5882a71d4fd19c306f5db9cc8561556c04737bf1d37c0c93c80"} Oct 03 08:51:31 crc kubenswrapper[4810]: I1003 08:51:31.093571 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-shp2j" podStartSLOduration=2.160273175 podStartE2EDuration="7.093549662s" podCreationTimestamp="2025-10-03 08:51:24 +0000 UTC" firstStartedPulling="2025-10-03 08:51:25.541079974 +0000 UTC m=+6918.968330719" lastFinishedPulling="2025-10-03 08:51:30.474356471 +0000 UTC m=+6923.901607206" observedRunningTime="2025-10-03 08:51:31.089743741 +0000 UTC m=+6924.516994496" watchObservedRunningTime="2025-10-03 08:51:31.093549662 +0000 UTC m=+6924.520800397" Oct 03 08:51:33 crc kubenswrapper[4810]: I1003 08:51:33.089059 4810 generic.go:334] "Generic (PLEG): container finished" podID="9235a01f-b369-484b-bc48-fa523d863294" containerID="61f214431d6eb5882a71d4fd19c306f5db9cc8561556c04737bf1d37c0c93c80" exitCode=0 Oct 03 08:51:33 crc kubenswrapper[4810]: I1003 08:51:33.089114 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-shp2j" event={"ID":"9235a01f-b369-484b-bc48-fa523d863294","Type":"ContainerDied","Data":"61f214431d6eb5882a71d4fd19c306f5db9cc8561556c04737bf1d37c0c93c80"} Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.429311 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.542437 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle\") pod \"9235a01f-b369-484b-bc48-fa523d863294\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.542519 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98wdb\" (UniqueName: \"kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb\") pod \"9235a01f-b369-484b-bc48-fa523d863294\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.543429 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data\") pod \"9235a01f-b369-484b-bc48-fa523d863294\" (UID: \"9235a01f-b369-484b-bc48-fa523d863294\") " Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.547833 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb" (OuterVolumeSpecName: "kube-api-access-98wdb") pod "9235a01f-b369-484b-bc48-fa523d863294" (UID: "9235a01f-b369-484b-bc48-fa523d863294"). InnerVolumeSpecName "kube-api-access-98wdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.569705 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9235a01f-b369-484b-bc48-fa523d863294" (UID: "9235a01f-b369-484b-bc48-fa523d863294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.585053 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data" (OuterVolumeSpecName: "config-data") pod "9235a01f-b369-484b-bc48-fa523d863294" (UID: "9235a01f-b369-484b-bc48-fa523d863294"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.646085 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.646126 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98wdb\" (UniqueName: \"kubernetes.io/projected/9235a01f-b369-484b-bc48-fa523d863294-kube-api-access-98wdb\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:34 crc kubenswrapper[4810]: I1003 08:51:34.646140 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9235a01f-b369-484b-bc48-fa523d863294-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.108435 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-shp2j" event={"ID":"9235a01f-b369-484b-bc48-fa523d863294","Type":"ContainerDied","Data":"481608dcdd9b01f215d680f4451357ba5c536ea2eaf46baf352e9de0b191b913"} Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.108507 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="481608dcdd9b01f215d680f4451357ba5c536ea2eaf46baf352e9de0b191b913" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.108604 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-shp2j" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.357056 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:51:35 crc kubenswrapper[4810]: E1003 08:51:35.359430 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9235a01f-b369-484b-bc48-fa523d863294" containerName="keystone-db-sync" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.359461 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9235a01f-b369-484b-bc48-fa523d863294" containerName="keystone-db-sync" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.359677 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9235a01f-b369-484b-bc48-fa523d863294" containerName="keystone-db-sync" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.360646 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.391683 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.398738 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hf7bk"] Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.400150 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.402512 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.402689 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.402826 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.404952 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zs8sq" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.421401 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hf7bk"] Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.463064 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.463131 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.463166 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.463253 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.464004 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.464360 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.464588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdzxb\" (UniqueName: \"kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb\") pod 
\"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.464805 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.465237 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.465313 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqhtv\" (UniqueName: \"kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.465380 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.566812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.566938 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.566977 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqhtv\" (UniqueName: \"kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567010 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567064 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data\") pod \"keystone-bootstrap-hf7bk\" 
(UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567087 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567163 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567189 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567218 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.567257 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdzxb\" (UniqueName: \"kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.568229 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.568549 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.569082 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.570849 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.571460 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.575943 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.575974 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.576019 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.576648 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.584969 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdzxb\" (UniqueName: \"kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb\") pod \"dnsmasq-dns-b465b4b5-nlkqs\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.600616 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqhtv\" (UniqueName: \"kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv\") pod \"keystone-bootstrap-hf7bk\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.680000 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:35 crc kubenswrapper[4810]: I1003 08:51:35.720883 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:36 crc kubenswrapper[4810]: I1003 08:51:36.164796 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:51:36 crc kubenswrapper[4810]: W1003 08:51:36.167532 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b4e081_45c1_4d63_8db7_b0f7c1555643.slice/crio-87dc8ab945c91a3bad59b58bf362f3e7aa5d2158a5b0f5e1db9de8e0bdec0b85 WatchSource:0}: Error finding container 87dc8ab945c91a3bad59b58bf362f3e7aa5d2158a5b0f5e1db9de8e0bdec0b85: Status 404 returned error can't find the container with id 87dc8ab945c91a3bad59b58bf362f3e7aa5d2158a5b0f5e1db9de8e0bdec0b85 Oct 03 08:51:36 crc kubenswrapper[4810]: W1003 08:51:36.230291 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b07f7a_4fc0_49f8_8e1b_0c8c4d25f55c.slice/crio-01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869 WatchSource:0}: Error finding container 01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869: Status 404 returned error can't find the container with id 01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869 Oct 03 08:51:36 crc kubenswrapper[4810]: I1003 08:51:36.231741 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hf7bk"] Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.126793 4810 generic.go:334] "Generic (PLEG): container finished" podID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerID="952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37" exitCode=0 Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.126878 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" event={"ID":"85b4e081-45c1-4d63-8db7-b0f7c1555643","Type":"ContainerDied","Data":"952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37"} Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.127250 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" event={"ID":"85b4e081-45c1-4d63-8db7-b0f7c1555643","Type":"ContainerStarted","Data":"87dc8ab945c91a3bad59b58bf362f3e7aa5d2158a5b0f5e1db9de8e0bdec0b85"} Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.129044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hf7bk" event={"ID":"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c","Type":"ContainerStarted","Data":"65c477036e034e9595a92ff85fdbb05324349f540a9eb6d7c19dabbb0a8f9653"} Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.129086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hf7bk" event={"ID":"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c","Type":"ContainerStarted","Data":"01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869"} Oct 03 08:51:37 crc kubenswrapper[4810]: I1003 08:51:37.195755 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hf7bk" podStartSLOduration=2.195735382 podStartE2EDuration="2.195735382s" podCreationTimestamp="2025-10-03 08:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:37.190571154 +0000 UTC m=+6930.617821889" watchObservedRunningTime="2025-10-03 08:51:37.195735382 +0000 UTC m=+6930.622986117" Oct 03 08:51:38 
crc kubenswrapper[4810]: I1003 08:51:38.139791 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" event={"ID":"85b4e081-45c1-4d63-8db7-b0f7c1555643","Type":"ContainerStarted","Data":"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec"} Oct 03 08:51:39 crc kubenswrapper[4810]: I1003 08:51:39.148134 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:41 crc kubenswrapper[4810]: I1003 08:51:41.168713 4810 generic.go:334] "Generic (PLEG): container finished" podID="14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" containerID="65c477036e034e9595a92ff85fdbb05324349f540a9eb6d7c19dabbb0a8f9653" exitCode=0 Oct 03 08:51:41 crc kubenswrapper[4810]: I1003 08:51:41.168840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hf7bk" event={"ID":"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c","Type":"ContainerDied","Data":"65c477036e034e9595a92ff85fdbb05324349f540a9eb6d7c19dabbb0a8f9653"} Oct 03 08:51:41 crc kubenswrapper[4810]: I1003 08:51:41.191564 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" podStartSLOduration=6.191539327 podStartE2EDuration="6.191539327s" podCreationTimestamp="2025-10-03 08:51:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:38.163240024 +0000 UTC m=+6931.590490759" watchObservedRunningTime="2025-10-03 08:51:41.191539327 +0000 UTC m=+6934.618790062" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.538136 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.597958 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.598050 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.598140 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.598226 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqhtv\" (UniqueName: \"kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.598321 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: 
\"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.598348 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts\") pod \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\" (UID: \"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c\") " Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.605355 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.606227 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.606352 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts" (OuterVolumeSpecName: "scripts") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.607211 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv" (OuterVolumeSpecName: "kube-api-access-gqhtv") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "kube-api-access-gqhtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.630269 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data" (OuterVolumeSpecName: "config-data") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.632605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" (UID: "14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700173 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700210 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700221 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700235 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqhtv\" (UniqueName: \"kubernetes.io/projected/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-kube-api-access-gqhtv\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700247 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4810]: I1003 08:51:42.700256 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.191620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hf7bk" event={"ID":"14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c","Type":"ContainerDied","Data":"01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869"} Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.191674 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01e3ee0beb9741a62732c05e2da0154ad0dd89fcd2326b68e00112d391a37869" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.191697 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hf7bk" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.275656 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hf7bk"] Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.282385 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hf7bk"] Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.315877 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" path="/var/lib/kubelet/pods/14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c/volumes" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.375495 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kgg9h"] Oct 03 08:51:43 crc kubenswrapper[4810]: E1003 08:51:43.375851 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" containerName="keystone-bootstrap" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.375865 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" containerName="keystone-bootstrap" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.376102 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b07f7a-4fc0-49f8-8e1b-0c8c4d25f55c" containerName="keystone-bootstrap" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.376657 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.379494 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.379842 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.380157 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zs8sq" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.380198 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.393312 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kgg9h"] Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.417942 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.418286 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.418409 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " 
pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.418485 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.418608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.418735 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j86h\" (UniqueName: \"kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520381 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520408 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520434 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520506 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.520554 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j86h\" (UniqueName: \"kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc 
kubenswrapper[4810]: I1003 08:51:43.525472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.525515 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.525795 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.530115 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.538628 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j86h\" (UniqueName: \"kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.538617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data\") pod \"keystone-bootstrap-kgg9h\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:43 crc kubenswrapper[4810]: I1003 08:51:43.703960 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:44 crc kubenswrapper[4810]: I1003 08:51:44.182757 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kgg9h"] Oct 03 08:51:44 crc kubenswrapper[4810]: I1003 08:51:44.205885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kgg9h" event={"ID":"149a19cb-62bc-4739-a90d-7366c2147b1d","Type":"ContainerStarted","Data":"3d2d89590ca7bda02250441f05cbfb89b5ceb62827906ab71fcbbe4c81543e78"} Oct 03 08:51:45 crc kubenswrapper[4810]: I1003 08:51:45.213548 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kgg9h" event={"ID":"149a19cb-62bc-4739-a90d-7366c2147b1d","Type":"ContainerStarted","Data":"456df9582fd0dabe43d26d0d1c99eaad082ba03a1a3d0ee1753963d93927930c"} Oct 03 08:51:45 crc kubenswrapper[4810]: I1003 08:51:45.682252 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:51:45 crc kubenswrapper[4810]: I1003 08:51:45.712138 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kgg9h" podStartSLOduration=2.71211646 podStartE2EDuration="2.71211646s" podCreationTimestamp="2025-10-03 08:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:45.239514571 +0000 UTC m=+6938.666765306" watchObservedRunningTime="2025-10-03 08:51:45.71211646 +0000 UTC m=+6939.139367195" Oct 03 08:51:45 crc kubenswrapper[4810]: I1003 08:51:45.743882 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:51:45 crc kubenswrapper[4810]: I1003 08:51:45.744299 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="dnsmasq-dns" containerID="cri-o://b33f071bdda3c521a02cc348fe4ec31841c70429a49d1c240f3effae63cc45e7" gracePeriod=10 Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.231132 4810 generic.go:334] "Generic (PLEG): container finished" podID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerID="b33f071bdda3c521a02cc348fe4ec31841c70429a49d1c240f3effae63cc45e7" exitCode=0 Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.232058 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" event={"ID":"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1","Type":"ContainerDied","Data":"b33f071bdda3c521a02cc348fe4ec31841c70429a49d1c240f3effae63cc45e7"} Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.350226 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.385809 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config\") pod \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.385870 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb\") pod \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.386002 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc\") pod \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.386067 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gg6g\" (UniqueName: \"kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g\") pod \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.386103 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb\") pod \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\" (UID: \"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1\") " Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.395192 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g" (OuterVolumeSpecName: "kube-api-access-7gg6g") pod "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" (UID: "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1"). InnerVolumeSpecName "kube-api-access-7gg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.437223 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" (UID: "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.446786 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" (UID: "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.447112 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" (UID: "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.448698 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config" (OuterVolumeSpecName: "config") pod "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" (UID: "d24b46eb-5ee6-4d6b-b303-c75de6f1aef1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.488266 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.488314 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gg6g\" (UniqueName: \"kubernetes.io/projected/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-kube-api-access-7gg6g\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.488331 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.488346 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:46 crc kubenswrapper[4810]: I1003 08:51:46.488357 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.245776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" event={"ID":"d24b46eb-5ee6-4d6b-b303-c75de6f1aef1","Type":"ContainerDied","Data":"7313e625c8ee0443d0d7e91bdef29227b7859e69a8a565dd4f34017717cd45e2"} Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.245877 4810 scope.go:117] "RemoveContainer" containerID="b33f071bdda3c521a02cc348fe4ec31841c70429a49d1c240f3effae63cc45e7" Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.246187 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-868956d5bc-hr8mz" Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.248030 4810 generic.go:334] "Generic (PLEG): container finished" podID="149a19cb-62bc-4739-a90d-7366c2147b1d" containerID="456df9582fd0dabe43d26d0d1c99eaad082ba03a1a3d0ee1753963d93927930c" exitCode=0 Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.248135 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kgg9h" event={"ID":"149a19cb-62bc-4739-a90d-7366c2147b1d","Type":"ContainerDied","Data":"456df9582fd0dabe43d26d0d1c99eaad082ba03a1a3d0ee1753963d93927930c"} Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.303368 4810 scope.go:117] "RemoveContainer" containerID="4b88520c3b3e0c7afd2c46fd692f07f64a2ae7a44511b94f59d104bbd98fc346" Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.317972 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:51:47 crc kubenswrapper[4810]: I1003 08:51:47.318025 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-868956d5bc-hr8mz"] Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.548982 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.628829 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j86h\" (UniqueName: \"kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.628979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.629031 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.629097 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.629142 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.629162 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data\") pod \"149a19cb-62bc-4739-a90d-7366c2147b1d\" (UID: \"149a19cb-62bc-4739-a90d-7366c2147b1d\") " Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.636254 4810 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.636409 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts" (OuterVolumeSpecName: "scripts") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.636977 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h" (OuterVolumeSpecName: "kube-api-access-2j86h") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "kube-api-access-2j86h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.638044 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.656694 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.668579 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data" (OuterVolumeSpecName: "config-data") pod "149a19cb-62bc-4739-a90d-7366c2147b1d" (UID: "149a19cb-62bc-4739-a90d-7366c2147b1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732551 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732623 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732634 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732645 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732657 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j86h\" (UniqueName: \"kubernetes.io/projected/149a19cb-62bc-4739-a90d-7366c2147b1d-kube-api-access-2j86h\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:48 crc kubenswrapper[4810]: I1003 08:51:48.732673 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/149a19cb-62bc-4739-a90d-7366c2147b1d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.271687 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kgg9h" event={"ID":"149a19cb-62bc-4739-a90d-7366c2147b1d","Type":"ContainerDied","Data":"3d2d89590ca7bda02250441f05cbfb89b5ceb62827906ab71fcbbe4c81543e78"} Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.272026 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2d89590ca7bda02250441f05cbfb89b5ceb62827906ab71fcbbe4c81543e78" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.271813 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kgg9h" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.330033 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" path="/var/lib/kubelet/pods/d24b46eb-5ee6-4d6b-b303-c75de6f1aef1/volumes" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.373675 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 08:51:49 crc kubenswrapper[4810]: E1003 08:51:49.374343 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149a19cb-62bc-4739-a90d-7366c2147b1d" containerName="keystone-bootstrap" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.374433 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="149a19cb-62bc-4739-a90d-7366c2147b1d" containerName="keystone-bootstrap" Oct 03 08:51:49 crc kubenswrapper[4810]: E1003 08:51:49.374511 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="dnsmasq-dns" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.374590 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="dnsmasq-dns" Oct 03 08:51:49 crc kubenswrapper[4810]: E1003 08:51:49.374675 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="init" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.374777 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="init" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.375072 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="149a19cb-62bc-4739-a90d-7366c2147b1d" containerName="keystone-bootstrap" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.375186 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24b46eb-5ee6-4d6b-b303-c75de6f1aef1" containerName="dnsmasq-dns" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.375976 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.383355 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.383580 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.383581 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.383814 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zs8sq" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.386207 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.387247 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.387601 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.452591 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.452932 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453028 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453076 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453177 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453214 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: 
\"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453262 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvr8v\" (UniqueName: \"kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.453338 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.555209 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.555942 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.556111 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.556238 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.556333 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.556449 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvr8v\" (UniqueName: \"kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.556584 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc 
kubenswrapper[4810]: I1003 08:51:49.556767 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.559701 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.560315 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.560312 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.560799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.561010 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.561332 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.561423 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.573395 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvr8v\" (UniqueName: \"kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v\") pod \"keystone-b9848c8cf-kb9qj\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:49 crc kubenswrapper[4810]: I1003 08:51:49.700657 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:50 crc kubenswrapper[4810]: I1003 08:51:50.175571 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 08:51:50 crc kubenswrapper[4810]: W1003 08:51:50.180464 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4be45a7_4886_405d_b10b_0519258715f6.slice/crio-9ea66bcfc2dff6a655069d90d0759d1578a75d48dfcdacd549e05651ed23cbe6 WatchSource:0}: Error finding container 9ea66bcfc2dff6a655069d90d0759d1578a75d48dfcdacd549e05651ed23cbe6: Status 404 returned error can't find the container with id 9ea66bcfc2dff6a655069d90d0759d1578a75d48dfcdacd549e05651ed23cbe6 Oct 03 08:51:50 crc kubenswrapper[4810]: I1003 08:51:50.283914 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9848c8cf-kb9qj" event={"ID":"e4be45a7-4886-405d-b10b-0519258715f6","Type":"ContainerStarted","Data":"9ea66bcfc2dff6a655069d90d0759d1578a75d48dfcdacd549e05651ed23cbe6"} Oct 03 08:51:51 crc kubenswrapper[4810]: I1003 08:51:51.296768 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9848c8cf-kb9qj" event={"ID":"e4be45a7-4886-405d-b10b-0519258715f6","Type":"ContainerStarted","Data":"3db6c0f0fe1196ece0d1fd23d0a431452da6842720d8625cef9b225452286fdb"} Oct 03 08:51:51 crc kubenswrapper[4810]: I1003 08:51:51.297119 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:51:51 crc kubenswrapper[4810]: I1003 08:51:51.322384 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b9848c8cf-kb9qj" podStartSLOduration=2.322355326 podStartE2EDuration="2.322355326s" podCreationTimestamp="2025-10-03 08:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:51:51.317578329 +0000 UTC m=+6944.744829064" watchObservedRunningTime="2025-10-03 08:51:51.322355326 +0000 UTC m=+6944.749606061" Oct 03 08:52:21 crc kubenswrapper[4810]: I1003 08:52:21.314159 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.277716 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.280570 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.283618 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.284384 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-x8qtx" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.287446 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.289386 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.380144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.380549 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.380588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k64lg\" (UniqueName: \"kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.380760 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.482495 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.482631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.482665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k64lg\" (UniqueName: \"kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.482744 4810 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.483677 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.488392 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.488400 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.499510 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k64lg\" (UniqueName: \"kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg\") pod \"openstackclient\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " pod="openstack/openstackclient" Oct 03 08:52:24 crc kubenswrapper[4810]: I1003 08:52:24.615917 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:52:25 crc kubenswrapper[4810]: I1003 08:52:25.047400 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:52:25 crc kubenswrapper[4810]: I1003 08:52:25.598514 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e","Type":"ContainerStarted","Data":"0d1ff05a30b3d08b853b90787cfd9ed91156f625fa3dcdb7ac22aa1d1d89cdc3"} Oct 03 08:52:36 crc kubenswrapper[4810]: I1003 08:52:36.680803 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e","Type":"ContainerStarted","Data":"4a754a9416c76b5ff8b30d2267dfe35883388f775a4427683dcd949007287c65"} Oct 03 08:53:02 crc kubenswrapper[4810]: I1003 08:53:02.088315 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:53:02 crc kubenswrapper[4810]: I1003 08:53:02.089016 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:53:32 crc kubenswrapper[4810]: I1003 08:53:32.088587 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:53:32 crc kubenswrapper[4810]: I1003 08:53:32.089185 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.737482 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=82.291349049 podStartE2EDuration="1m32.73745975s" podCreationTimestamp="2025-10-03 08:52:24 +0000 UTC" firstStartedPulling="2025-10-03 08:52:25.055421916 +0000 UTC m=+6978.482672651" lastFinishedPulling="2025-10-03 08:52:35.501532617 +0000 UTC m=+6988.928783352" observedRunningTime="2025-10-03 08:52:36.699612706 +0000 UTC m=+6990.126863441" watchObservedRunningTime="2025-10-03 08:53:56.73745975 +0000 UTC m=+7070.164710485" Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.739060 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-b42pc"] Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.740047 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.752305 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b42pc"] Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.825277 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmjgm\" (UniqueName: \"kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm\") pod \"barbican-db-create-b42pc\" (UID: \"d3af9106-156d-42a4-a177-4a104952b1f0\") " pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.927520 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmjgm\" (UniqueName: \"kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm\") pod \"barbican-db-create-b42pc\" (UID: \"d3af9106-156d-42a4-a177-4a104952b1f0\") " pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:56 crc kubenswrapper[4810]: I1003 08:53:56.951714 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmjgm\" (UniqueName: \"kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm\") pod \"barbican-db-create-b42pc\" (UID: \"d3af9106-156d-42a4-a177-4a104952b1f0\") " pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:57 crc kubenswrapper[4810]: I1003 08:53:57.055585 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:57 crc kubenswrapper[4810]: I1003 08:53:57.471874 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-b42pc"] Oct 03 08:53:58 crc kubenswrapper[4810]: I1003 08:53:58.385601 4810 generic.go:334] "Generic (PLEG): container finished" podID="d3af9106-156d-42a4-a177-4a104952b1f0" containerID="b52673607d02a31be3ec875e91f22569829daf5bed9fb51893d9141467beef9b" exitCode=0 Oct 03 08:53:58 crc kubenswrapper[4810]: I1003 08:53:58.385654 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b42pc" event={"ID":"d3af9106-156d-42a4-a177-4a104952b1f0","Type":"ContainerDied","Data":"b52673607d02a31be3ec875e91f22569829daf5bed9fb51893d9141467beef9b"} Oct 03 08:53:58 crc kubenswrapper[4810]: I1003 08:53:58.385685 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b42pc" event={"ID":"d3af9106-156d-42a4-a177-4a104952b1f0","Type":"ContainerStarted","Data":"4110f71dd6d06eb9ad4ab3a7f104b641d8240e7734d9bde32723c66c0e20a90a"} Oct 03 08:53:59 crc kubenswrapper[4810]: I1003 08:53:59.691726 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-b42pc" Oct 03 08:53:59 crc kubenswrapper[4810]: I1003 08:53:59.783079 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmjgm\" (UniqueName: \"kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm\") pod \"d3af9106-156d-42a4-a177-4a104952b1f0\" (UID: \"d3af9106-156d-42a4-a177-4a104952b1f0\") " Oct 03 08:53:59 crc kubenswrapper[4810]: I1003 08:53:59.788468 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm" (OuterVolumeSpecName: "kube-api-access-tmjgm") pod "d3af9106-156d-42a4-a177-4a104952b1f0" (UID: "d3af9106-156d-42a4-a177-4a104952b1f0"). 
InnerVolumeSpecName "kube-api-access-tmjgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:59 crc kubenswrapper[4810]: I1003 08:53:59.884813 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmjgm\" (UniqueName: \"kubernetes.io/projected/d3af9106-156d-42a4-a177-4a104952b1f0-kube-api-access-tmjgm\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:00 crc kubenswrapper[4810]: I1003 08:54:00.404587 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-b42pc" event={"ID":"d3af9106-156d-42a4-a177-4a104952b1f0","Type":"ContainerDied","Data":"4110f71dd6d06eb9ad4ab3a7f104b641d8240e7734d9bde32723c66c0e20a90a"} Oct 03 08:54:00 crc kubenswrapper[4810]: I1003 08:54:00.405020 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4110f71dd6d06eb9ad4ab3a7f104b641d8240e7734d9bde32723c66c0e20a90a" Oct 03 08:54:00 crc kubenswrapper[4810]: I1003 08:54:00.404659 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-b42pc" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.088662 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.088717 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.088756 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.089478 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.089822 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" gracePeriod=600 Oct 03 08:54:02 crc kubenswrapper[4810]: E1003 08:54:02.216100 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.427860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99"} Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.427952 4810 scope.go:117] "RemoveContainer" containerID="7c4f18e73ccffa63b0ce26c39504e8995a2ef8a00fba238db70db8be5b507bb1" Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.427810 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" exitCode=0 Oct 03 08:54:02 crc kubenswrapper[4810]: I1003 08:54:02.428498 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:54:02 crc kubenswrapper[4810]: E1003 08:54:02.428754 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.745408 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-6011-account-create-6zn2n"] Oct 03 08:54:06 crc kubenswrapper[4810]: E1003 08:54:06.746313 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3af9106-156d-42a4-a177-4a104952b1f0" containerName="mariadb-database-create" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.746327 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3af9106-156d-42a4-a177-4a104952b1f0" containerName="mariadb-database-create" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.746499 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3af9106-156d-42a4-a177-4a104952b1f0" containerName="mariadb-database-create" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.747151 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.749246 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.756863 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6011-account-create-6zn2n"] Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.805739 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8l48\" (UniqueName: \"kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48\") pod \"barbican-6011-account-create-6zn2n\" (UID: \"0c2896c9-43c9-439b-ac75-88a33a0928be\") " pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.907582 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8l48\" (UniqueName: \"kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48\") pod \"barbican-6011-account-create-6zn2n\" (UID: \"0c2896c9-43c9-439b-ac75-88a33a0928be\") " pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:06 crc kubenswrapper[4810]: I1003 08:54:06.929749 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8l48\" (UniqueName: \"kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48\") pod \"barbican-6011-account-create-6zn2n\" (UID: \"0c2896c9-43c9-439b-ac75-88a33a0928be\") " pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:07 crc kubenswrapper[4810]: I1003 08:54:07.073587 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:07 crc kubenswrapper[4810]: I1003 08:54:07.526958 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-6011-account-create-6zn2n"] Oct 03 08:54:07 crc kubenswrapper[4810]: I1003 08:54:07.538951 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 08:54:08 crc kubenswrapper[4810]: I1003 08:54:08.476266 4810 generic.go:334] "Generic (PLEG): container finished" podID="0c2896c9-43c9-439b-ac75-88a33a0928be" containerID="ae6412e881c90e54d9b1b29cff15387b52b4b7f40e6109b57f47fef1ee661338" exitCode=0 Oct 03 08:54:08 crc kubenswrapper[4810]: I1003 08:54:08.476349 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6011-account-create-6zn2n" event={"ID":"0c2896c9-43c9-439b-ac75-88a33a0928be","Type":"ContainerDied","Data":"ae6412e881c90e54d9b1b29cff15387b52b4b7f40e6109b57f47fef1ee661338"} Oct 03 08:54:08 crc kubenswrapper[4810]: I1003 08:54:08.476594 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6011-account-create-6zn2n" event={"ID":"0c2896c9-43c9-439b-ac75-88a33a0928be","Type":"ContainerStarted","Data":"6d66a30dcd3a2fef3896deb74a2bfef0a765655b5e41b3b684489a33f71dadef"} Oct 03 08:54:09 crc kubenswrapper[4810]: I1003 08:54:09.792554 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:09 crc kubenswrapper[4810]: I1003 08:54:09.858502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8l48\" (UniqueName: \"kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48\") pod \"0c2896c9-43c9-439b-ac75-88a33a0928be\" (UID: \"0c2896c9-43c9-439b-ac75-88a33a0928be\") " Oct 03 08:54:09 crc kubenswrapper[4810]: I1003 08:54:09.864147 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48" (OuterVolumeSpecName: "kube-api-access-v8l48") pod "0c2896c9-43c9-439b-ac75-88a33a0928be" (UID: "0c2896c9-43c9-439b-ac75-88a33a0928be"). InnerVolumeSpecName "kube-api-access-v8l48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:09 crc kubenswrapper[4810]: I1003 08:54:09.961038 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8l48\" (UniqueName: \"kubernetes.io/projected/0c2896c9-43c9-439b-ac75-88a33a0928be-kube-api-access-v8l48\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:10 crc kubenswrapper[4810]: I1003 08:54:10.494764 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-6011-account-create-6zn2n" event={"ID":"0c2896c9-43c9-439b-ac75-88a33a0928be","Type":"ContainerDied","Data":"6d66a30dcd3a2fef3896deb74a2bfef0a765655b5e41b3b684489a33f71dadef"} Oct 03 08:54:10 crc kubenswrapper[4810]: I1003 08:54:10.495190 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d66a30dcd3a2fef3896deb74a2bfef0a765655b5e41b3b684489a33f71dadef" Oct 03 08:54:10 crc kubenswrapper[4810]: I1003 08:54:10.494809 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-6011-account-create-6zn2n" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.963827 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-96fxf"] Oct 03 08:54:11 crc kubenswrapper[4810]: E1003 08:54:11.964787 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2896c9-43c9-439b-ac75-88a33a0928be" containerName="mariadb-account-create" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.964810 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2896c9-43c9-439b-ac75-88a33a0928be" containerName="mariadb-account-create" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.965473 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2896c9-43c9-439b-ac75-88a33a0928be" containerName="mariadb-account-create" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.966629 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.969277 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hbngd" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.969635 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.971503 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-96fxf"] Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.994599 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lrt\" (UniqueName: \"kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.994834 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:11 crc kubenswrapper[4810]: I1003 08:54:11.994869 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.096313 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lrt\" (UniqueName: \"kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.096391 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.096426 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.102281 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.105265 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.113637 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lrt\" (UniqueName: \"kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt\") pod \"barbican-db-sync-96fxf\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.296481 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:12 crc kubenswrapper[4810]: I1003 08:54:12.743235 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-96fxf"] Oct 03 08:54:13 crc kubenswrapper[4810]: I1003 08:54:13.524390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96fxf" event={"ID":"24a4700d-8b62-43c3-9dee-6971f3c59399","Type":"ContainerStarted","Data":"d6c366adadb7d728aa13070b121b24f2d58e21be87d0413bce070c20f8af43a3"} Oct 03 08:54:17 crc kubenswrapper[4810]: I1003 08:54:17.312547 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:54:17 crc kubenswrapper[4810]: E1003 08:54:17.313291 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:17 crc kubenswrapper[4810]: I1003 08:54:17.562558 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96fxf" event={"ID":"24a4700d-8b62-43c3-9dee-6971f3c59399","Type":"ContainerStarted","Data":"5155a216c214c324bde39971c5a142849a558759dc4cd64c80fe40ac6bc1f59c"} Oct 03 08:54:17 crc kubenswrapper[4810]: I1003 08:54:17.589058 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-96fxf" podStartSLOduration=2.040275452 podStartE2EDuration="6.589039646s" podCreationTimestamp="2025-10-03 08:54:11 +0000 UTC" firstStartedPulling="2025-10-03 08:54:12.73849456 +0000 UTC m=+7086.165745305" lastFinishedPulling="2025-10-03 08:54:17.287258764 +0000 UTC m=+7090.714509499" observedRunningTime="2025-10-03 08:54:17.586284603 +0000 UTC m=+7091.013535358" watchObservedRunningTime="2025-10-03 08:54:17.589039646 +0000 UTC m=+7091.016290381" Oct 03 08:54:19 crc kubenswrapper[4810]: I1003 08:54:19.579815 4810 generic.go:334] "Generic (PLEG): container finished" podID="24a4700d-8b62-43c3-9dee-6971f3c59399" containerID="5155a216c214c324bde39971c5a142849a558759dc4cd64c80fe40ac6bc1f59c" exitCode=0 Oct 03 08:54:19 crc kubenswrapper[4810]: I1003 08:54:19.579911 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96fxf" event={"ID":"24a4700d-8b62-43c3-9dee-6971f3c59399","Type":"ContainerDied","Data":"5155a216c214c324bde39971c5a142849a558759dc4cd64c80fe40ac6bc1f59c"} Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.897405 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.985418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle\") pod \"24a4700d-8b62-43c3-9dee-6971f3c59399\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.985481 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lrt\" (UniqueName: \"kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt\") pod \"24a4700d-8b62-43c3-9dee-6971f3c59399\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.985524 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data\") pod \"24a4700d-8b62-43c3-9dee-6971f3c59399\" (UID: \"24a4700d-8b62-43c3-9dee-6971f3c59399\") " Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.992170 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "24a4700d-8b62-43c3-9dee-6971f3c59399" (UID: "24a4700d-8b62-43c3-9dee-6971f3c59399"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:54:20 crc kubenswrapper[4810]: I1003 08:54:20.992464 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt" (OuterVolumeSpecName: "kube-api-access-m2lrt") pod "24a4700d-8b62-43c3-9dee-6971f3c59399" (UID: "24a4700d-8b62-43c3-9dee-6971f3c59399"). InnerVolumeSpecName "kube-api-access-m2lrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.014375 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a4700d-8b62-43c3-9dee-6971f3c59399" (UID: "24a4700d-8b62-43c3-9dee-6971f3c59399"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.087746 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lrt\" (UniqueName: \"kubernetes.io/projected/24a4700d-8b62-43c3-9dee-6971f3c59399-kube-api-access-m2lrt\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.087784 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.087794 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a4700d-8b62-43c3-9dee-6971f3c59399-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.602530 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-96fxf" event={"ID":"24a4700d-8b62-43c3-9dee-6971f3c59399","Type":"ContainerDied","Data":"d6c366adadb7d728aa13070b121b24f2d58e21be87d0413bce070c20f8af43a3"} Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.602954 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c366adadb7d728aa13070b121b24f2d58e21be87d0413bce070c20f8af43a3" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.602637 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-96fxf" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.826727 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 08:54:21 crc kubenswrapper[4810]: E1003 08:54:21.827167 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a4700d-8b62-43c3-9dee-6971f3c59399" containerName="barbican-db-sync" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.827190 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a4700d-8b62-43c3-9dee-6971f3c59399" containerName="barbican-db-sync" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.827477 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a4700d-8b62-43c3-9dee-6971f3c59399" containerName="barbican-db-sync" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.829180 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.831181 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.831829 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.835264 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hbngd" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.844278 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.903887 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.904017 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcf2d\" (UniqueName: \"kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.904075 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.904102 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.904200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.954149 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.955534 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.961767 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 08:54:21 crc kubenswrapper[4810]: I1003 08:54:21.974508 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006584 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006688 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006719 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcf2d\" (UniqueName: \"kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006751 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7pb\" (UniqueName: \"kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006781 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006798 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006814 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006857 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.006888 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.007982 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.014568 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.020610 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.039089 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.057069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcf2d\" (UniqueName: \"kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d\") pod \"barbican-worker-7688d87799-z7jvj\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.075983 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.081204 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.108352 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.110872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.111006 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.111127 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.111166 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.111216 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7pb\" (UniqueName: \"kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.112932 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.120351 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.130695 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc 
kubenswrapper[4810]: I1003 08:54:22.145064 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7pb\" (UniqueName: \"kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.170783 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data\") pod \"barbican-keystone-listener-d546c7c6d-hsgpr\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.175435 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.213025 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.213091 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.213132 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.213163 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98d94\" (UniqueName: \"kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.213261 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.232980 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.234688 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.240204 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.248364 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.273379 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319758 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319808 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319831 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319908 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.319961 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.320009 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98d94\" (UniqueName: \"kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.320153 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q29cl\" 
(UniqueName: \"kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.320182 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.320249 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.321207 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.321221 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.322410 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.323357 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.351157 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98d94\" (UniqueName: \"kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94\") pod \"dnsmasq-dns-547d48b7f7-ttx72\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.421246 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q29cl\" (UniqueName: \"kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.421505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs\") pod 
\"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.421578 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.421604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.421642 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.423429 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.427583 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.428356 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.433517 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.441503 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q29cl\" (UniqueName: \"kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl\") pod \"barbican-api-7df9cdc9d-rjmsf\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.554820 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.576743 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.597057 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.623546 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerStarted","Data":"65d9fd647735f56cf5b3bb1035d79865533360e329c4b709f523d19429d03972"} Oct 03 08:54:22 crc kubenswrapper[4810]: I1003 08:54:22.692532 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.074239 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:54:23 crc kubenswrapper[4810]: W1003 08:54:23.108317 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9746ac25_a0c6_41c4_bf75_62f4f1ee51ce.slice/crio-cd5b227c8037429fed0aa1767bc66d917aa4bb6419673c7f1f54748d5b42b518 WatchSource:0}: Error finding container cd5b227c8037429fed0aa1767bc66d917aa4bb6419673c7f1f54748d5b42b518: Status 404 returned error can't find the container with id cd5b227c8037429fed0aa1767bc66d917aa4bb6419673c7f1f54748d5b42b518 Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.161054 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.634367 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerStarted","Data":"6a10989d37573ed5a34af23d4a9877ad85752935f1d2a8c43e77b5fd6b30a7a1"} Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.634719 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerStarted","Data":"22f4fd4ad6afb7e9c13c860dc711b96fac278d61a4ed0f7c1da12fa6aa744c70"} Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.636151 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerStarted","Data":"70243bb2239e34b21a066d21957c415544fa02aff12a63e6940a9612cae5d09d"} Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.638435 4810 generic.go:334] "Generic (PLEG): container finished" podID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerID="db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931" exitCode=0 Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.638504 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" event={"ID":"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce","Type":"ContainerDied","Data":"db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931"} Oct 03 08:54:23 crc kubenswrapper[4810]: I1003 08:54:23.638563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" event={"ID":"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce","Type":"ContainerStarted","Data":"cd5b227c8037429fed0aa1767bc66d917aa4bb6419673c7f1f54748d5b42b518"} Oct 03 08:54:24 crc kubenswrapper[4810]: I1003 08:54:24.653823 4810 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerStarted","Data":"ba8f23b00e574ed38b94b464a2550358de6f47b13a732b76199ab43ac8a36415"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.036195 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.037545 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.046323 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.046519 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.057015 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199095 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199371 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199522 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzd5q\" (UniqueName: \"kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199638 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199694 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 
08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.199726 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301420 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301481 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzd5q\" (UniqueName: \"kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301504 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301530 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301553 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301686 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.301725 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.302344 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 
08:54:25.311763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.327730 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.327816 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.328153 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.329548 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzd5q\" (UniqueName: \"kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.335871 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs\") pod \"barbican-api-6866556666-25fj4\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.435325 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.666623 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerStarted","Data":"e38e84bc6b6ee906a778b23d1848296550b447437e04a7e712e4ebf93b978f4a"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.672295 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerStarted","Data":"cdda3f94e72b2e0fc1363ad8dec85327b12a98239d858659e6c717c1fc77cf5f"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.672360 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerStarted","Data":"a9eb5c4e834ec18df22b9dc93a6f7d4c539136ebfda5ef3d4283bbbf3ada1234"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.676369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" event={"ID":"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce","Type":"ContainerStarted","Data":"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.676688 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.678739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerStarted","Data":"84264a86d2128e9e97f646f8c03edeb4970ee766c71e9bcef3a5893828e276ee"} Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.678865 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.678935 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.688741 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" podStartSLOduration=2.9674333600000002 podStartE2EDuration="4.688722536s" podCreationTimestamp="2025-10-03 08:54:21 +0000 UTC" firstStartedPulling="2025-10-03 08:54:22.563116102 +0000 UTC m=+7095.990366837" lastFinishedPulling="2025-10-03 08:54:24.284405278 +0000 UTC m=+7097.711656013" observedRunningTime="2025-10-03 08:54:25.680779673 +0000 UTC m=+7099.108030408" watchObservedRunningTime="2025-10-03 08:54:25.688722536 +0000 UTC m=+7099.115973271" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.709757 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7df9cdc9d-rjmsf" podStartSLOduration=3.709733467 podStartE2EDuration="3.709733467s" podCreationTimestamp="2025-10-03 08:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:54:25.698954929 +0000 UTC m=+7099.126205664" watchObservedRunningTime="2025-10-03 08:54:25.709733467 +0000 UTC m=+7099.136984212" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.737988 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" podStartSLOduration=3.737963291 podStartE2EDuration="3.737963291s" podCreationTimestamp="2025-10-03 08:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:54:25.727632925 +0000 UTC m=+7099.154883670" watchObservedRunningTime="2025-10-03 08:54:25.737963291 +0000 UTC m=+7099.165214036" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.756366 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7688d87799-z7jvj" podStartSLOduration=3.190053757 podStartE2EDuration="4.756342592s" podCreationTimestamp="2025-10-03 08:54:21 +0000 UTC" firstStartedPulling="2025-10-03 08:54:22.716665335 +0000 UTC m=+7096.143916070" lastFinishedPulling="2025-10-03 08:54:24.28295417 +0000 UTC m=+7097.710204905" observedRunningTime="2025-10-03 08:54:25.748872543 +0000 UTC m=+7099.176123288" watchObservedRunningTime="2025-10-03 08:54:25.756342592 +0000 UTC m=+7099.183593327" Oct 03 08:54:25 crc kubenswrapper[4810]: I1003 08:54:25.863930 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 08:54:25 crc kubenswrapper[4810]: W1003 08:54:25.866974 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36616a3d_b630_49fe_80a4_5e024f5f575f.slice/crio-c02443ef2ea493851553b7fde9dbf8dff0cc4ff927abad827210275f8ab476a6 WatchSource:0}: Error finding container c02443ef2ea493851553b7fde9dbf8dff0cc4ff927abad827210275f8ab476a6: Status 404 returned error can't find the container with id c02443ef2ea493851553b7fde9dbf8dff0cc4ff927abad827210275f8ab476a6 Oct 03 08:54:26 crc kubenswrapper[4810]: I1003 08:54:26.689703 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerStarted","Data":"bef0d553e5791a6b7a41534f43da71d6aedce548e058b8e26b6a986f52a3616f"} Oct 03 08:54:26 crc kubenswrapper[4810]: I1003 08:54:26.690007 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerStarted","Data":"68ad4f5b62f1bf62badf506a3cc48bf9e7ddd204e527495d5adf65f34b954e77"} Oct 03 08:54:26 crc kubenswrapper[4810]: I1003 08:54:26.690023 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerStarted","Data":"c02443ef2ea493851553b7fde9dbf8dff0cc4ff927abad827210275f8ab476a6"} Oct 03 08:54:26 crc kubenswrapper[4810]: I1003 08:54:26.716391 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6866556666-25fj4" podStartSLOduration=1.7163715800000001 podStartE2EDuration="1.71637158s" podCreationTimestamp="2025-10-03 08:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:54:26.715263931 +0000 UTC m=+7100.142514666" watchObservedRunningTime="2025-10-03 08:54:26.71637158 +0000 UTC m=+7100.143622325" Oct 03 08:54:27 crc kubenswrapper[4810]: I1003 08:54:27.697512 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:27 crc kubenswrapper[4810]: I1003 08:54:27.697884 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:32 crc kubenswrapper[4810]: I1003 08:54:32.302263 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:54:32 crc kubenswrapper[4810]: E1003 08:54:32.303077 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:32 crc kubenswrapper[4810]: I1003 08:54:32.580058 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:54:32 crc kubenswrapper[4810]: I1003 08:54:32.646653 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:54:32 crc kubenswrapper[4810]: I1003 08:54:32.647105 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="dnsmasq-dns" containerID="cri-o://fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec" gracePeriod=10 Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.204395 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.371265 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb\") pod \"85b4e081-45c1-4d63-8db7-b0f7c1555643\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.371305 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdzxb\" (UniqueName: \"kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb\") pod \"85b4e081-45c1-4d63-8db7-b0f7c1555643\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.371330 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc\") pod \"85b4e081-45c1-4d63-8db7-b0f7c1555643\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.371442 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config\") pod \"85b4e081-45c1-4d63-8db7-b0f7c1555643\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.371502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb\") pod \"85b4e081-45c1-4d63-8db7-b0f7c1555643\" (UID: \"85b4e081-45c1-4d63-8db7-b0f7c1555643\") " Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.378021 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb" (OuterVolumeSpecName: "kube-api-access-gdzxb") pod "85b4e081-45c1-4d63-8db7-b0f7c1555643" (UID: "85b4e081-45c1-4d63-8db7-b0f7c1555643"). InnerVolumeSpecName "kube-api-access-gdzxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.417748 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85b4e081-45c1-4d63-8db7-b0f7c1555643" (UID: "85b4e081-45c1-4d63-8db7-b0f7c1555643"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.417857 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config" (OuterVolumeSpecName: "config") pod "85b4e081-45c1-4d63-8db7-b0f7c1555643" (UID: "85b4e081-45c1-4d63-8db7-b0f7c1555643"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.418516 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "85b4e081-45c1-4d63-8db7-b0f7c1555643" (UID: "85b4e081-45c1-4d63-8db7-b0f7c1555643"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.434233 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "85b4e081-45c1-4d63-8db7-b0f7c1555643" (UID: "85b4e081-45c1-4d63-8db7-b0f7c1555643"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.474767 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.474817 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.474832 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdzxb\" (UniqueName: \"kubernetes.io/projected/85b4e081-45c1-4d63-8db7-b0f7c1555643-kube-api-access-gdzxb\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.474843 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.474854 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85b4e081-45c1-4d63-8db7-b0f7c1555643-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.765937 4810 generic.go:334] "Generic (PLEG): container finished" podID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerID="fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec" exitCode=0 Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.765987 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" event={"ID":"85b4e081-45c1-4d63-8db7-b0f7c1555643","Type":"ContainerDied","Data":"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec"} Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.766026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" event={"ID":"85b4e081-45c1-4d63-8db7-b0f7c1555643","Type":"ContainerDied","Data":"87dc8ab945c91a3bad59b58bf362f3e7aa5d2158a5b0f5e1db9de8e0bdec0b85"} Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.766052 4810 scope.go:117] "RemoveContainer" containerID="fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.766114 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b465b4b5-nlkqs" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.826157 4810 scope.go:117] "RemoveContainer" containerID="952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.839246 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.848448 4810 scope.go:117] "RemoveContainer" containerID="fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec" Oct 03 08:54:33 crc kubenswrapper[4810]: E1003 08:54:33.848963 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec\": container with ID starting with fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec not found: ID does not exist" containerID="fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.848997 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec"} err="failed to get container status \"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec\": rpc error: code = NotFound desc = could not find container \"fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec\": container with ID starting with fbc7d395fe50b081c1308b66220f8942a2bc79e399d8d6acd60b05aac1eb0cec not found: ID does not exist" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.849026 4810 scope.go:117] "RemoveContainer" containerID="952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.849039 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b465b4b5-nlkqs"] Oct 03 08:54:33 crc kubenswrapper[4810]: E1003 08:54:33.849376 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37\": container with ID starting with 952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37 not found: ID does not exist" containerID="952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37" Oct 03 08:54:33 crc kubenswrapper[4810]: I1003 08:54:33.849405 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37"} err="failed to get container status \"952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37\": rpc error: code = NotFound desc = could not find container \"952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37\": container with ID starting with 952e46aa7c793690f5ff18ed4394bc8f860781ba25b132068b7fbc0a387bcd37 not found: ID does not exist" Oct 03 08:54:34 crc kubenswrapper[4810]: I1003 08:54:34.155058 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:34 crc kubenswrapper[4810]: I1003 08:54:34.237068 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:35 crc kubenswrapper[4810]: I1003 08:54:35.313842 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" path="/var/lib/kubelet/pods/85b4e081-45c1-4d63-8db7-b0f7c1555643/volumes" Oct 03 08:54:36 crc kubenswrapper[4810]: I1003 08:54:36.855826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:36 crc kubenswrapper[4810]: I1003 08:54:36.927219 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 08:54:36 crc kubenswrapper[4810]: I1003 08:54:36.983998 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:36 crc kubenswrapper[4810]: I1003 08:54:36.984252 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7df9cdc9d-rjmsf" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api-log" containerID="cri-o://6a10989d37573ed5a34af23d4a9877ad85752935f1d2a8c43e77b5fd6b30a7a1" gracePeriod=30 Oct 03 08:54:36 crc kubenswrapper[4810]: I1003 08:54:36.984933 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7df9cdc9d-rjmsf" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api" containerID="cri-o://84264a86d2128e9e97f646f8c03edeb4970ee766c71e9bcef3a5893828e276ee" gracePeriod=30 Oct 03 08:54:37 crc kubenswrapper[4810]: I1003 08:54:37.804171 4810 generic.go:334] "Generic (PLEG): container finished" podID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerID="6a10989d37573ed5a34af23d4a9877ad85752935f1d2a8c43e77b5fd6b30a7a1" exitCode=143 Oct 03 08:54:37 crc kubenswrapper[4810]: I1003 08:54:37.804249 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerDied","Data":"6a10989d37573ed5a34af23d4a9877ad85752935f1d2a8c43e77b5fd6b30a7a1"} Oct 03 08:54:40 crc kubenswrapper[4810]: I1003 08:54:40.218261 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7df9cdc9d-rjmsf" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read tcp 10.217.0.2:49092->10.217.1.47:9311: read: connection reset by peer" Oct 03 08:54:40 crc kubenswrapper[4810]: I1003 08:54:40.218380 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7df9cdc9d-rjmsf" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.47:9311/healthcheck\": read tcp 10.217.0.2:49090->10.217.1.47:9311: read: connection reset by peer" Oct 03 08:54:40 crc kubenswrapper[4810]: I1003 08:54:40.830172 4810 generic.go:334] "Generic (PLEG): container finished" podID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerID="84264a86d2128e9e97f646f8c03edeb4970ee766c71e9bcef3a5893828e276ee" exitCode=0 Oct 03 08:54:40 crc kubenswrapper[4810]: I1003 08:54:40.830217 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerDied","Data":"84264a86d2128e9e97f646f8c03edeb4970ee766c71e9bcef3a5893828e276ee"} Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.147941 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.341455 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q29cl\" (UniqueName: \"kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl\") pod \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.341604 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle\") pod \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.341654 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data\") pod \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.341679 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom\") pod \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.341712 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs\") pod \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\" (UID: \"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53\") " Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.342687 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs" (OuterVolumeSpecName: "logs") pod "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" (UID: "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.347078 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl" (OuterVolumeSpecName: "kube-api-access-q29cl") pod "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" (UID: "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53"). InnerVolumeSpecName "kube-api-access-q29cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.347208 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" (UID: "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.366090 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" (UID: "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.399182 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data" (OuterVolumeSpecName: "config-data") pod "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" (UID: "2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.443509 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.443542 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.443552 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.443560 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.443569 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q29cl\" (UniqueName: \"kubernetes.io/projected/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53-kube-api-access-q29cl\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.839063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7df9cdc9d-rjmsf" event={"ID":"2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53","Type":"ContainerDied","Data":"22f4fd4ad6afb7e9c13c860dc711b96fac278d61a4ed0f7c1da12fa6aa744c70"} Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.839127 4810 scope.go:117] "RemoveContainer" containerID="84264a86d2128e9e97f646f8c03edeb4970ee766c71e9bcef3a5893828e276ee" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.839141 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7df9cdc9d-rjmsf" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.869685 4810 scope.go:117] "RemoveContainer" containerID="6a10989d37573ed5a34af23d4a9877ad85752935f1d2a8c43e77b5fd6b30a7a1" Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.871214 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:41 crc kubenswrapper[4810]: I1003 08:54:41.877150 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7df9cdc9d-rjmsf"] Oct 03 08:54:43 crc kubenswrapper[4810]: I1003 08:54:43.302732 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:54:43 crc kubenswrapper[4810]: E1003 08:54:43.303204 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:43 crc kubenswrapper[4810]: I1003 08:54:43.313978 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" path="/var/lib/kubelet/pods/2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53/volumes" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.304176 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:54:57 crc kubenswrapper[4810]: E1003 08:54:57.305128 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.936651 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:54:57 crc kubenswrapper[4810]: E1003 08:54:57.937065 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="init" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937083 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="init" Oct 03 08:54:57 crc kubenswrapper[4810]: E1003 08:54:57.937097 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937105 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api" Oct 03 08:54:57 crc kubenswrapper[4810]: E1003 08:54:57.937122 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="dnsmasq-dns" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937132 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="dnsmasq-dns" Oct 03 08:54:57 crc kubenswrapper[4810]: E1003 
08:54:57.937152 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api-log" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937161 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api-log" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937349 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api-log" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937384 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e11e217-b8ab-48c3-9f2e-a9d1c7a73a53" containerName="barbican-api" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.937403 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b4e081-45c1-4d63-8db7-b0f7c1555643" containerName="dnsmasq-dns" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.938720 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:57 crc kubenswrapper[4810]: I1003 08:54:57.955863 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.064205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.064637 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.064683 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qppj6\" (UniqueName: \"kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.166068 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.166138 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qppj6\" (UniqueName: \"kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.166179 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.166716 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.166839 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.185217 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qppj6\" (UniqueName: \"kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6\") pod \"redhat-marketplace-wmmxz\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.262944 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.586916 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-w8s7r"] Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.588644 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-w8s7r" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.593878 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8s7r"] Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.685041 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8zs\" (UniqueName: \"kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs\") pod \"neutron-db-create-w8s7r\" (UID: \"befb7b90-32fb-45a4-80e8-6e2540bd458a\") " pod="openstack/neutron-db-create-w8s7r" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.759319 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.787777 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8zs\" (UniqueName: \"kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs\") pod \"neutron-db-create-w8s7r\" (UID: \"befb7b90-32fb-45a4-80e8-6e2540bd458a\") " pod="openstack/neutron-db-create-w8s7r" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.808130 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8zs\" (UniqueName: \"kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs\") pod \"neutron-db-create-w8s7r\" (UID: \"befb7b90-32fb-45a4-80e8-6e2540bd458a\") " pod="openstack/neutron-db-create-w8s7r" Oct 03 08:54:58 crc kubenswrapper[4810]: I1003 08:54:58.920854 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8s7r" Oct 03 08:54:59 crc kubenswrapper[4810]: I1003 08:54:59.008779 4810 generic.go:334] "Generic (PLEG): container finished" podID="69683377-c543-48a5-b8bd-f565efc86973" containerID="639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653" exitCode=0 Oct 03 08:54:59 crc kubenswrapper[4810]: I1003 08:54:59.008828 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerDied","Data":"639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653"} Oct 03 08:54:59 crc kubenswrapper[4810]: I1003 08:54:59.008860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerStarted","Data":"a382d3d0aa64b535329f94530d4af19c61a16cf9be883f1a457a4548893d3260"} Oct 03 08:54:59 crc kubenswrapper[4810]: I1003 08:54:59.344716 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-w8s7r"] Oct 03 08:54:59 crc kubenswrapper[4810]: W1003 08:54:59.355008 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbefb7b90_32fb_45a4_80e8_6e2540bd458a.slice/crio-926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c WatchSource:0}: Error finding container 926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c: Status 404 returned error can't find the container with id 926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c Oct 03 08:55:00 crc kubenswrapper[4810]: I1003 08:55:00.023812 4810 generic.go:334] "Generic (PLEG): container finished" podID="befb7b90-32fb-45a4-80e8-6e2540bd458a" containerID="316345a114ef9ee66df79c88ba8023061810e416879575b9361e7e39a678b4ab" exitCode=0 Oct 03 08:55:00 crc kubenswrapper[4810]: I1003 08:55:00.023906 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8s7r" event={"ID":"befb7b90-32fb-45a4-80e8-6e2540bd458a","Type":"ContainerDied","Data":"316345a114ef9ee66df79c88ba8023061810e416879575b9361e7e39a678b4ab"} Oct 03 08:55:00 crc kubenswrapper[4810]: I1003 08:55:00.024385 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8s7r" event={"ID":"befb7b90-32fb-45a4-80e8-6e2540bd458a","Type":"ContainerStarted","Data":"926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c"} Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.038394 4810 generic.go:334] "Generic (PLEG): container finished" podID="69683377-c543-48a5-b8bd-f565efc86973" containerID="34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e" exitCode=0 Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.039093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerDied","Data":"34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e"} Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.380351 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8s7r" Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.535483 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl8zs\" (UniqueName: \"kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs\") pod \"befb7b90-32fb-45a4-80e8-6e2540bd458a\" (UID: \"befb7b90-32fb-45a4-80e8-6e2540bd458a\") " Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.545071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs" (OuterVolumeSpecName: "kube-api-access-wl8zs") pod "befb7b90-32fb-45a4-80e8-6e2540bd458a" (UID: "befb7b90-32fb-45a4-80e8-6e2540bd458a"). InnerVolumeSpecName "kube-api-access-wl8zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:01 crc kubenswrapper[4810]: I1003 08:55:01.637195 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl8zs\" (UniqueName: \"kubernetes.io/projected/befb7b90-32fb-45a4-80e8-6e2540bd458a-kube-api-access-wl8zs\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:02 crc kubenswrapper[4810]: I1003 08:55:02.048860 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-w8s7r" event={"ID":"befb7b90-32fb-45a4-80e8-6e2540bd458a","Type":"ContainerDied","Data":"926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c"} Oct 03 08:55:02 crc kubenswrapper[4810]: I1003 08:55:02.049181 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="926a2b9acb2b65ec784374978f001c63ee097580adc905c2be54bb48f35c275c" Oct 03 08:55:02 crc kubenswrapper[4810]: I1003 08:55:02.048930 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-w8s7r" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.061582 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerStarted","Data":"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af"} Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.088175 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wmmxz" podStartSLOduration=3.17222593 podStartE2EDuration="6.08815577s" podCreationTimestamp="2025-10-03 08:54:57 +0000 UTC" firstStartedPulling="2025-10-03 08:54:59.010673148 +0000 UTC m=+7132.437923883" lastFinishedPulling="2025-10-03 08:55:01.926602988 +0000 UTC m=+7135.353853723" observedRunningTime="2025-10-03 08:55:03.083169227 +0000 UTC m=+7136.510419962" watchObservedRunningTime="2025-10-03 08:55:03.08815577 +0000 UTC m=+7136.515406505" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.539565 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:03 crc kubenswrapper[4810]: E1003 08:55:03.540314 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="befb7b90-32fb-45a4-80e8-6e2540bd458a" containerName="mariadb-database-create" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.540333 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="befb7b90-32fb-45a4-80e8-6e2540bd458a" containerName="mariadb-database-create" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.540535 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="befb7b90-32fb-45a4-80e8-6e2540bd458a" containerName="mariadb-database-create" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.541825 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.549964 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.673391 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.673662 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5tf\" (UniqueName: \"kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.673802 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.775700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.776004 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5tf\" (UniqueName: \"kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.776146 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.776322 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.776687 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.804975 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zv5tf\" (UniqueName: \"kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf\") pod \"community-operators-5qzpj\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:03 crc kubenswrapper[4810]: I1003 08:55:03.865118 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:04 crc kubenswrapper[4810]: I1003 08:55:04.366566 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:05 crc kubenswrapper[4810]: I1003 08:55:05.076025 4810 generic.go:334] "Generic (PLEG): container finished" podID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerID="3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9" exitCode=0 Oct 03 08:55:05 crc kubenswrapper[4810]: I1003 08:55:05.076340 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerDied","Data":"3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9"} Oct 03 08:55:05 crc kubenswrapper[4810]: I1003 08:55:05.076365 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerStarted","Data":"089a2f0830f142d0513a1d5ba539577ee1645d4caa39a915b462a2b646870898"} Oct 03 08:55:07 crc kubenswrapper[4810]: I1003 08:55:07.093874 4810 generic.go:334] "Generic (PLEG): container finished" podID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerID="d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5" exitCode=0 Oct 03 08:55:07 crc kubenswrapper[4810]: I1003 08:55:07.093944 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerDied","Data":"d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5"} Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.107831 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerStarted","Data":"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546"} Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.128758 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5qzpj" podStartSLOduration=2.695360543 podStartE2EDuration="5.128737193s" podCreationTimestamp="2025-10-03 08:55:03 +0000 UTC" firstStartedPulling="2025-10-03 08:55:05.078035591 +0000 UTC m=+7138.505286326" lastFinishedPulling="2025-10-03 08:55:07.511412231 +0000 UTC m=+7140.938662976" observedRunningTime="2025-10-03 08:55:08.1271224 +0000 UTC m=+7141.554373135" watchObservedRunningTime="2025-10-03 08:55:08.128737193 +0000 UTC m=+7141.555987928" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.263071 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.264039 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.313273 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.818794 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f221-account-create-c4l9d"] Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.820067 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.821986 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.830419 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f221-account-create-c4l9d"] Oct 03 08:55:08 crc kubenswrapper[4810]: I1003 08:55:08.970833 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4mqs\" (UniqueName: \"kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs\") pod \"neutron-f221-account-create-c4l9d\" (UID: \"f42c70db-9189-44ad-8658-f6632d99565a\") " pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:09 crc kubenswrapper[4810]: I1003 08:55:09.073196 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4mqs\" (UniqueName: \"kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs\") pod \"neutron-f221-account-create-c4l9d\" (UID: \"f42c70db-9189-44ad-8658-f6632d99565a\") " pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:09 crc kubenswrapper[4810]: I1003 08:55:09.096260 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4mqs\" (UniqueName: \"kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs\") pod \"neutron-f221-account-create-c4l9d\" (UID: \"f42c70db-9189-44ad-8658-f6632d99565a\") " pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:09 crc kubenswrapper[4810]: I1003 08:55:09.141221 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:09 crc kubenswrapper[4810]: I1003 08:55:09.177849 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:09 crc kubenswrapper[4810]: I1003 08:55:09.570867 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f221-account-create-c4l9d"] Oct 03 08:55:10 crc kubenswrapper[4810]: I1003 08:55:10.125081 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f221-account-create-c4l9d" event={"ID":"f42c70db-9189-44ad-8658-f6632d99565a","Type":"ContainerStarted","Data":"649fb516ffcce0ea8f4cc74f8497a92141ca39c45305d503c5ed5a54b9287c1c"} Oct 03 08:55:10 crc kubenswrapper[4810]: I1003 08:55:10.125402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f221-account-create-c4l9d" event={"ID":"f42c70db-9189-44ad-8658-f6632d99565a","Type":"ContainerStarted","Data":"d0d3b43164e7ab602c9c4224d72a79aeb12c77e6882ce920a15c942eba9699af"} Oct 03 08:55:10 crc kubenswrapper[4810]: I1003 08:55:10.132816 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:55:10 crc kubenswrapper[4810]: I1003 08:55:10.139323 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f221-account-create-c4l9d" podStartSLOduration=2.139300767 podStartE2EDuration="2.139300767s" podCreationTimestamp="2025-10-03 08:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:10.138079244 +0000 UTC m=+7143.565329989" watchObservedRunningTime="2025-10-03 08:55:10.139300767 +0000 UTC m=+7143.566551502" Oct 03 08:55:11 crc kubenswrapper[4810]: I1003 08:55:11.134324 4810 generic.go:334] "Generic (PLEG): container finished" podID="f42c70db-9189-44ad-8658-f6632d99565a" containerID="649fb516ffcce0ea8f4cc74f8497a92141ca39c45305d503c5ed5a54b9287c1c" exitCode=0 Oct 03 08:55:11 crc kubenswrapper[4810]: I1003 08:55:11.134388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f221-account-create-c4l9d" event={"ID":"f42c70db-9189-44ad-8658-f6632d99565a","Type":"ContainerDied","Data":"649fb516ffcce0ea8f4cc74f8497a92141ca39c45305d503c5ed5a54b9287c1c"} Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.144991 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wmmxz" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="registry-server" containerID="cri-o://210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af" gracePeriod=2 Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.302996 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:55:12 crc kubenswrapper[4810]: E1003 08:55:12.303620 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.469030 4810 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.642133 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4mqs\" (UniqueName: \"kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs\") pod \"f42c70db-9189-44ad-8658-f6632d99565a\" (UID: \"f42c70db-9189-44ad-8658-f6632d99565a\") " Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.648167 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs" (OuterVolumeSpecName: "kube-api-access-w4mqs") pod "f42c70db-9189-44ad-8658-f6632d99565a" (UID: "f42c70db-9189-44ad-8658-f6632d99565a"). InnerVolumeSpecName "kube-api-access-w4mqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:12 crc kubenswrapper[4810]: I1003 08:55:12.744421 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4mqs\" (UniqueName: \"kubernetes.io/projected/f42c70db-9189-44ad-8658-f6632d99565a-kube-api-access-w4mqs\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.101249 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.157504 4810 generic.go:334] "Generic (PLEG): container finished" podID="69683377-c543-48a5-b8bd-f565efc86973" containerID="210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af" exitCode=0 Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.157573 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmmxz" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.157592 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerDied","Data":"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af"} Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.158196 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmmxz" event={"ID":"69683377-c543-48a5-b8bd-f565efc86973","Type":"ContainerDied","Data":"a382d3d0aa64b535329f94530d4af19c61a16cf9be883f1a457a4548893d3260"} Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.158245 4810 scope.go:117] "RemoveContainer" containerID="210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.160321 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f221-account-create-c4l9d" event={"ID":"f42c70db-9189-44ad-8658-f6632d99565a","Type":"ContainerDied","Data":"d0d3b43164e7ab602c9c4224d72a79aeb12c77e6882ce920a15c942eba9699af"} Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.160374 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0d3b43164e7ab602c9c4224d72a79aeb12c77e6882ce920a15c942eba9699af" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.160443 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f221-account-create-c4l9d" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.189416 4810 scope.go:117] "RemoveContainer" containerID="34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.207649 4810 scope.go:117] "RemoveContainer" containerID="639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.226428 4810 scope.go:117] "RemoveContainer" containerID="210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af" Oct 03 08:55:13 crc kubenswrapper[4810]: E1003 08:55:13.227062 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af\": container with ID starting with 210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af not found: ID does not exist" containerID="210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.227096 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af"} err="failed to get container status \"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af\": rpc error: code = NotFound desc = could not find container \"210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af\": container with ID starting with 210d1bde39b9cb565d01b2e04acd8cf23e72d8bef1411a772a4b30a1d5ac02af not found: ID does not exist" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.227121 4810 scope.go:117] "RemoveContainer" containerID="34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e" Oct 03 08:55:13 crc kubenswrapper[4810]: E1003 08:55:13.227453 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e\": container with ID starting with 34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e not found: ID does not exist" containerID="34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.227483 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e"} err="failed to get container status \"34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e\": rpc error: code = NotFound desc = could not find container \"34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e\": container with ID starting with 34bfbf4ddf2c152a51277b48645f9626c3c372ccfe3253174435e380e4cdef5e not found: ID does not exist" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.227495 4810 scope.go:117] "RemoveContainer" containerID="639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653" Oct 03 08:55:13 crc kubenswrapper[4810]: E1003 08:55:13.227809 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653\": container with ID starting with 639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653 not found: ID does not exist" containerID="639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653" Oct 
03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.227855 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653"} err="failed to get container status \"639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653\": rpc error: code = NotFound desc = could not find container \"639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653\": container with ID starting with 639d74052ffa6f8149999551a957d3381fdf5638c6c974a2cafbf0d8ef086653 not found: ID does not exist" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.252380 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities\") pod \"69683377-c543-48a5-b8bd-f565efc86973\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.252443 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content\") pod \"69683377-c543-48a5-b8bd-f565efc86973\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.252541 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qppj6\" (UniqueName: \"kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6\") pod \"69683377-c543-48a5-b8bd-f565efc86973\" (UID: \"69683377-c543-48a5-b8bd-f565efc86973\") " Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.253952 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities" (OuterVolumeSpecName: "utilities") pod "69683377-c543-48a5-b8bd-f565efc86973" (UID: "69683377-c543-48a5-b8bd-f565efc86973"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.254520 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.256825 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6" (OuterVolumeSpecName: "kube-api-access-qppj6") pod "69683377-c543-48a5-b8bd-f565efc86973" (UID: "69683377-c543-48a5-b8bd-f565efc86973"). InnerVolumeSpecName "kube-api-access-qppj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.268123 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69683377-c543-48a5-b8bd-f565efc86973" (UID: "69683377-c543-48a5-b8bd-f565efc86973"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.356284 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69683377-c543-48a5-b8bd-f565efc86973-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.356344 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qppj6\" (UniqueName: \"kubernetes.io/projected/69683377-c543-48a5-b8bd-f565efc86973-kube-api-access-qppj6\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.484194 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.493083 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmmxz"] Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.866486 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.866574 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:13 crc kubenswrapper[4810]: I1003 08:55:13.929464 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.038249 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fkzrg"] Oct 03 08:55:14 crc kubenswrapper[4810]: E1003 08:55:14.039105 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f42c70db-9189-44ad-8658-f6632d99565a" containerName="mariadb-account-create" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039128 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42c70db-9189-44ad-8658-f6632d99565a" containerName="mariadb-account-create" Oct 03 08:55:14 crc kubenswrapper[4810]: E1003 08:55:14.039144 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="extract-content" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039153 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="extract-content" Oct 03 08:55:14 crc kubenswrapper[4810]: E1003 08:55:14.039182 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="registry-server" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039191 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="registry-server" Oct 03 08:55:14 crc kubenswrapper[4810]: E1003 08:55:14.039206 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="extract-utilities" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039215 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="extract-utilities" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039398 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="69683377-c543-48a5-b8bd-f565efc86973" containerName="registry-server" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.039434 4810 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f42c70db-9189-44ad-8658-f6632d99565a" containerName="mariadb-account-create" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.040083 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.043885 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.043900 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dpc9k" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.044489 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.049200 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fkzrg"] Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.170296 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xldv\" (UniqueName: \"kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.170781 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.170864 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.229525 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.272818 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.272867 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.272939 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xldv\" (UniqueName: \"kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.277369 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.278710 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.294247 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xldv\" (UniqueName: \"kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv\") pod \"neutron-db-sync-fkzrg\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.365257 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:14 crc kubenswrapper[4810]: I1003 08:55:14.833206 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fkzrg"] Oct 03 08:55:15 crc kubenswrapper[4810]: I1003 08:55:15.194227 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fkzrg" event={"ID":"8d233c1b-8472-419e-9d71-ab0cf11c2605","Type":"ContainerStarted","Data":"910c4507038139ef51066fd7772205b3f10067f87baedb27075b1d39019cb2fc"} Oct 03 08:55:15 crc kubenswrapper[4810]: I1003 08:55:15.194275 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fkzrg" event={"ID":"8d233c1b-8472-419e-9d71-ab0cf11c2605","Type":"ContainerStarted","Data":"ba60909e3e5d211e7c7cb825c007e7e498ddffa70e943ad3b4fb4521abb88e6f"} Oct 03 08:55:15 crc kubenswrapper[4810]: I1003 08:55:15.312591 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69683377-c543-48a5-b8bd-f565efc86973" path="/var/lib/kubelet/pods/69683377-c543-48a5-b8bd-f565efc86973/volumes" Oct 03 08:55:15 crc kubenswrapper[4810]: I1003 08:55:15.728360 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fkzrg" podStartSLOduration=1.728342023 podStartE2EDuration="1.728342023s" podCreationTimestamp="2025-10-03 08:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:15.214155325 +0000 UTC m=+7148.641406070" watchObservedRunningTime="2025-10-03 08:55:15.728342023 +0000 UTC m=+7149.155592758" Oct 03 08:55:15 crc kubenswrapper[4810]: I1003 08:55:15.732536 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.204528 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5qzpj" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="registry-server" containerID="cri-o://3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546" gracePeriod=2 Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.671242 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.818408 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content\") pod \"b50021e4-4615-454a-b65b-e22ed3debdb6\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.818507 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv5tf\" (UniqueName: \"kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf\") pod \"b50021e4-4615-454a-b65b-e22ed3debdb6\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.819778 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities\") pod \"b50021e4-4615-454a-b65b-e22ed3debdb6\" (UID: \"b50021e4-4615-454a-b65b-e22ed3debdb6\") " Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.820795 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities" (OuterVolumeSpecName: "utilities") pod "b50021e4-4615-454a-b65b-e22ed3debdb6" (UID: "b50021e4-4615-454a-b65b-e22ed3debdb6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.826676 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf" (OuterVolumeSpecName: "kube-api-access-zv5tf") pod "b50021e4-4615-454a-b65b-e22ed3debdb6" (UID: "b50021e4-4615-454a-b65b-e22ed3debdb6"). InnerVolumeSpecName "kube-api-access-zv5tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.861015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b50021e4-4615-454a-b65b-e22ed3debdb6" (UID: "b50021e4-4615-454a-b65b-e22ed3debdb6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.921852 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.921926 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv5tf\" (UniqueName: \"kubernetes.io/projected/b50021e4-4615-454a-b65b-e22ed3debdb6-kube-api-access-zv5tf\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:16 crc kubenswrapper[4810]: I1003 08:55:16.921936 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b50021e4-4615-454a-b65b-e22ed3debdb6-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.214935 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerDied","Data":"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546"} Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.214996 4810 scope.go:117] "RemoveContainer" containerID="3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.214947 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qzpj" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.214881 4810 generic.go:334] "Generic (PLEG): container finished" podID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerID="3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546" exitCode=0 Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.229243 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qzpj" event={"ID":"b50021e4-4615-454a-b65b-e22ed3debdb6","Type":"ContainerDied","Data":"089a2f0830f142d0513a1d5ba539577ee1645d4caa39a915b462a2b646870898"} Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.235922 4810 scope.go:117] "RemoveContainer" containerID="d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.262127 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.267563 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5qzpj"] Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.281580 4810 scope.go:117] "RemoveContainer" containerID="3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.302826 4810 scope.go:117] "RemoveContainer" containerID="3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546" Oct 03 08:55:17 crc kubenswrapper[4810]: E1003 08:55:17.303186 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546\": container with ID starting with 3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546 not found: ID does not exist" containerID="3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.303272 
4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546"} err="failed to get container status \"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546\": rpc error: code = NotFound desc = could not find container \"3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546\": container with ID starting with 3c04c6fe0c66145c5112767154e1f19af039d1fd9ae0ec28a4a43e77fadc7546 not found: ID does not exist" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.303340 4810 scope.go:117] "RemoveContainer" containerID="d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5" Oct 03 08:55:17 crc kubenswrapper[4810]: E1003 08:55:17.303894 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5\": container with ID starting with d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5 not found: ID does not exist" containerID="d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.304020 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5"} err="failed to get container status \"d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5\": rpc error: code = NotFound desc = could not find container \"d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5\": container with ID starting with d74303a6bb9b2797d64a0f762c802edaa7b340df6834adf1ce4d348a221036c5 not found: ID does not exist" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.304060 4810 scope.go:117] "RemoveContainer" containerID="3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9" Oct 03 08:55:17 crc kubenswrapper[4810]: E1003 08:55:17.304940 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9\": container with ID starting with 3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9 not found: ID does not exist" containerID="3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.304964 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9"} err="failed to get container status \"3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9\": rpc error: code = NotFound desc = could not find container \"3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9\": container with ID starting with 3026b2e44540b9f94eed815c21ad4ddd39c2d871628fe8a0c503bacb65e315a9 not found: ID does not exist" Oct 03 08:55:17 crc kubenswrapper[4810]: I1003 08:55:17.312499 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" path="/var/lib/kubelet/pods/b50021e4-4615-454a-b65b-e22ed3debdb6/volumes" Oct 03 08:55:19 crc kubenswrapper[4810]: I1003 08:55:19.247327 4810 generic.go:334] "Generic (PLEG): container finished" podID="8d233c1b-8472-419e-9d71-ab0cf11c2605" containerID="910c4507038139ef51066fd7772205b3f10067f87baedb27075b1d39019cb2fc" exitCode=0 Oct 03 08:55:19 crc kubenswrapper[4810]: 
I1003 08:55:19.247410 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fkzrg" event={"ID":"8d233c1b-8472-419e-9d71-ab0cf11c2605","Type":"ContainerDied","Data":"910c4507038139ef51066fd7772205b3f10067f87baedb27075b1d39019cb2fc"} Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.591255 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.690668 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xldv\" (UniqueName: \"kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv\") pod \"8d233c1b-8472-419e-9d71-ab0cf11c2605\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.690963 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config\") pod \"8d233c1b-8472-419e-9d71-ab0cf11c2605\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.692029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle\") pod \"8d233c1b-8472-419e-9d71-ab0cf11c2605\" (UID: \"8d233c1b-8472-419e-9d71-ab0cf11c2605\") " Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.697989 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv" (OuterVolumeSpecName: "kube-api-access-6xldv") pod "8d233c1b-8472-419e-9d71-ab0cf11c2605" (UID: "8d233c1b-8472-419e-9d71-ab0cf11c2605"). InnerVolumeSpecName "kube-api-access-6xldv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.733152 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config" (OuterVolumeSpecName: "config") pod "8d233c1b-8472-419e-9d71-ab0cf11c2605" (UID: "8d233c1b-8472-419e-9d71-ab0cf11c2605"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.734476 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d233c1b-8472-419e-9d71-ab0cf11c2605" (UID: "8d233c1b-8472-419e-9d71-ab0cf11c2605"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.793723 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.793762 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d233c1b-8472-419e-9d71-ab0cf11c2605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:20 crc kubenswrapper[4810]: I1003 08:55:20.793779 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xldv\" (UniqueName: \"kubernetes.io/projected/8d233c1b-8472-419e-9d71-ab0cf11c2605-kube-api-access-6xldv\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.270159 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fkzrg" event={"ID":"8d233c1b-8472-419e-9d71-ab0cf11c2605","Type":"ContainerDied","Data":"ba60909e3e5d211e7c7cb825c007e7e498ddffa70e943ad3b4fb4521abb88e6f"} Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.270206 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba60909e3e5d211e7c7cb825c007e7e498ddffa70e943ad3b4fb4521abb88e6f" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.270262 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fkzrg" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402197 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:55:21 crc kubenswrapper[4810]: E1003 08:55:21.402532 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="extract-content" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402544 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="extract-content" Oct 03 08:55:21 crc kubenswrapper[4810]: E1003 08:55:21.402561 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="extract-utilities" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402568 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="extract-utilities" Oct 03 08:55:21 crc kubenswrapper[4810]: E1003 08:55:21.402584 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="registry-server" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402590 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="registry-server" Oct 03 08:55:21 crc kubenswrapper[4810]: E1003 08:55:21.402610 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d233c1b-8472-419e-9d71-ab0cf11c2605" containerName="neutron-db-sync" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402618 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d233c1b-8472-419e-9d71-ab0cf11c2605" containerName="neutron-db-sync" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.402788 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d233c1b-8472-419e-9d71-ab0cf11c2605" containerName="neutron-db-sync" Oct 03 08:55:21 crc 
kubenswrapper[4810]: I1003 08:55:21.402801 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b50021e4-4615-454a-b65b-e22ed3debdb6" containerName="registry-server" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.403788 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.434288 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.508180 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb75z\" (UniqueName: \"kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.508265 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.508356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.508489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.508533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.574841 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.576771 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.581384 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.581435 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.581610 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-dpc9k" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.581710 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.589988 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.609706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.610180 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.610246 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.610271 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.610382 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb75z\" (UniqueName: \"kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.611595 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.612223 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " 
pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.612667 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.615557 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.656052 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb75z\" (UniqueName: \"kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z\") pod \"dnsmasq-dns-6d9559fbd5-qh82b\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.711533 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.711585 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.711869 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.712020 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkpr6\" (UniqueName: \"kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.712156 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.726342 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.814005 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.814080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkpr6\" (UniqueName: \"kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.814135 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.814188 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.814217 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.821841 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.822770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.830633 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.831109 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.850634 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kkpr6\" (UniqueName: \"kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6\") pod \"neutron-dd55c49f8-dnnqz\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:21 crc kubenswrapper[4810]: I1003 08:55:21.903608 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:22 crc kubenswrapper[4810]: I1003 08:55:22.325539 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:55:22 crc kubenswrapper[4810]: I1003 08:55:22.501053 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:22 crc kubenswrapper[4810]: W1003 08:55:22.507730 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8b30b23_032e_47bd_bb1f_5f7b2c75d176.slice/crio-141df0cafcf9c39f41a360b739038d41971439454f0f1c98a59edf4d300bb467 WatchSource:0}: Error finding container 141df0cafcf9c39f41a360b739038d41971439454f0f1c98a59edf4d300bb467: Status 404 returned error can't find the container with id 141df0cafcf9c39f41a360b739038d41971439454f0f1c98a59edf4d300bb467 Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.309847 4810 generic.go:334] "Generic (PLEG): container finished" podID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerID="0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06" exitCode=0 Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322580 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" event={"ID":"86bef6b2-bea6-4405-a2e7-6b033769cbdb","Type":"ContainerDied","Data":"0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06"} Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322738 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" event={"ID":"86bef6b2-bea6-4405-a2e7-6b033769cbdb","Type":"ContainerStarted","Data":"0b2b9b6bb464e424fdd7a8ff2838c5be22c51ad2d73b6b10ba30f870d7665f34"} Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322760 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerStarted","Data":"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58"} Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322778 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerStarted","Data":"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9"} Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.322793 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerStarted","Data":"141df0cafcf9c39f41a360b739038d41971439454f0f1c98a59edf4d300bb467"} Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.366087 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dd55c49f8-dnnqz" podStartSLOduration=2.36604787 podStartE2EDuration="2.36604787s" podCreationTimestamp="2025-10-03 
08:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:23.365310511 +0000 UTC m=+7156.792561266" watchObservedRunningTime="2025-10-03 08:55:23.36604787 +0000 UTC m=+7156.793298605" Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.921333 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.923426 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.925306 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.925598 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 08:55:23 crc kubenswrapper[4810]: I1003 08:55:23.941846 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067050 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067104 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067121 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067197 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqv7\" (UniqueName: \"kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067217 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.067247 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.169147 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.169464 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqv7\" (UniqueName: \"kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.169588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.169776 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.169961 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.170081 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.170153 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.175423 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.178313 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.178495 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.182511 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.183212 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.185585 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.196346 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqv7\" (UniqueName: \"kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7\") pod \"neutron-b9499669-jd95s\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.241489 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.337716 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" event={"ID":"86bef6b2-bea6-4405-a2e7-6b033769cbdb","Type":"ContainerStarted","Data":"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087"} Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.337765 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.365180 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" podStartSLOduration=3.365160093 podStartE2EDuration="3.365160093s" podCreationTimestamp="2025-10-03 08:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:24.362774749 +0000 UTC m=+7157.790025474" watchObservedRunningTime="2025-10-03 08:55:24.365160093 +0000 UTC m=+7157.792410828" Oct 03 08:55:24 crc kubenswrapper[4810]: I1003 08:55:24.801623 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 08:55:25 crc kubenswrapper[4810]: I1003 08:55:25.348065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerStarted","Data":"7f5bf483256e9e40c501144201c3dc96463a540abcc9d8379933be6e0f86f926"} Oct 03 08:55:25 crc kubenswrapper[4810]: I1003 08:55:25.349203 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerStarted","Data":"b96ac21f6b885aebfbc17e90eb2244817f36b2064b8dcd70ce7c7d24cb87aa0f"} Oct 03 08:55:25 crc kubenswrapper[4810]: I1003 08:55:25.349251 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerStarted","Data":"410fa5f41d72d766d3ada43a743ac5d2d88f201f43172b3856f15637e8ce18cd"} Oct 03 08:55:25 crc kubenswrapper[4810]: I1003 08:55:25.377676 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b9499669-jd95s" podStartSLOduration=2.3776536520000002 podStartE2EDuration="2.377653652s" podCreationTimestamp="2025-10-03 08:55:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:25.372002441 +0000 UTC m=+7158.799253176" watchObservedRunningTime="2025-10-03 08:55:25.377653652 +0000 UTC m=+7158.804904387" Oct 03 08:55:26 crc kubenswrapper[4810]: I1003 08:55:26.356559 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:27 crc kubenswrapper[4810]: I1003 08:55:27.310360 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:55:27 crc kubenswrapper[4810]: E1003 08:55:27.310826 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:55:31 crc kubenswrapper[4810]: I1003 08:55:31.728142 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:55:31 crc kubenswrapper[4810]: I1003 08:55:31.780451 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:55:31 crc kubenswrapper[4810]: I1003 08:55:31.780708 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="dnsmasq-dns" containerID="cri-o://90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e" gracePeriod=10 Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.273510 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.410691 4810 generic.go:334] "Generic (PLEG): container finished" podID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerID="90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e" exitCode=0 Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.410745 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.410744 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" event={"ID":"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce","Type":"ContainerDied","Data":"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e"} Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.410879 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547d48b7f7-ttx72" event={"ID":"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce","Type":"ContainerDied","Data":"cd5b227c8037429fed0aa1767bc66d917aa4bb6419673c7f1f54748d5b42b518"} Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.410946 4810 scope.go:117] "RemoveContainer" containerID="90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.430915 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config\") pod \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.431187 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb\") pod \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.431248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98d94\" (UniqueName: \"kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94\") pod \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.431279 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc\") pod \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.431320 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb\") pod \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\" (UID: \"9746ac25-a0c6-41c4-bf75-62f4f1ee51ce\") " Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.434696 4810 scope.go:117] "RemoveContainer" containerID="db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.436974 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94" (OuterVolumeSpecName: "kube-api-access-98d94") pod "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" (UID: "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce"). InnerVolumeSpecName "kube-api-access-98d94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.478280 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" (UID: "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.484508 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" (UID: "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.489149 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" (UID: "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.491544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config" (OuterVolumeSpecName: "config") pod "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" (UID: "9746ac25-a0c6-41c4-bf75-62f4f1ee51ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.534854 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.534975 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98d94\" (UniqueName: \"kubernetes.io/projected/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-kube-api-access-98d94\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.534991 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.535002 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.535013 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.552088 4810 scope.go:117] "RemoveContainer" containerID="90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e" Oct 03 08:55:32 crc kubenswrapper[4810]: E1003 08:55:32.552512 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e\": container with ID starting with 90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e not found: ID does not exist" containerID="90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.552551 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e"} err="failed to get container status \"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e\": rpc error: code = NotFound desc = could not find container \"90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e\": container with ID starting with 90ae6c0503af519fc40fe4f2c32b623bf3344b7984cd891fcd2667e17679c38e not found: ID does not exist" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.552578 4810 scope.go:117] "RemoveContainer" containerID="db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931" Oct 03 08:55:32 crc kubenswrapper[4810]: E1003 08:55:32.552835 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931\": container with ID starting with db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931 not found: ID does not exist" containerID="db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.552863 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931"} err="failed to get container status 
\"db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931\": rpc error: code = NotFound desc = could not find container \"db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931\": container with ID starting with db87382e887c91369de383bfc700840aa0adab0d575d4c7e1623195fb2579931 not found: ID does not exist" Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.739957 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:55:32 crc kubenswrapper[4810]: I1003 08:55:32.748457 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547d48b7f7-ttx72"] Oct 03 08:55:33 crc kubenswrapper[4810]: I1003 08:55:33.318289 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" path="/var/lib/kubelet/pods/9746ac25-a0c6-41c4-bf75-62f4f1ee51ce/volumes" Oct 03 08:55:42 crc kubenswrapper[4810]: I1003 08:55:42.303103 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:55:42 crc kubenswrapper[4810]: E1003 08:55:42.304298 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:55:51 crc kubenswrapper[4810]: I1003 08:55:51.912948 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.255875 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b9499669-jd95s" Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.303197 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:55:54 crc kubenswrapper[4810]: E1003 08:55:54.303489 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.331401 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.332411 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dd55c49f8-dnnqz" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-api" containerID="cri-o://b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9" gracePeriod=30 Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.332653 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-dd55c49f8-dnnqz" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-httpd" containerID="cri-o://4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58" gracePeriod=30 Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.615148 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerID="4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58" exitCode=0 Oct 03 08:55:54 crc kubenswrapper[4810]: I1003 08:55:54.615187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerDied","Data":"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58"} Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.565435 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.651093 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerID="b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9" exitCode=0 Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.651489 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dd55c49f8-dnnqz" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.651507 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerDied","Data":"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9"} Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.651923 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dd55c49f8-dnnqz" event={"ID":"f8b30b23-032e-47bd-bb1f-5f7b2c75d176","Type":"ContainerDied","Data":"141df0cafcf9c39f41a360b739038d41971439454f0f1c98a59edf4d300bb467"} Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.651964 4810 scope.go:117] "RemoveContainer" containerID="4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.674485 4810 scope.go:117] "RemoveContainer" containerID="b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.692680 4810 scope.go:117] "RemoveContainer" containerID="4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58" Oct 03 08:55:58 crc kubenswrapper[4810]: E1003 08:55:58.693156 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58\": container with ID starting with 4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58 not found: ID does not exist" containerID="4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.693214 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58"} err="failed to get container status \"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58\": rpc error: code = NotFound desc = could not find container \"4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58\": container with ID starting with 4077c16a87c831fc42e64953008b63890955bb130da90aff2a3071be2c6dde58 not found: ID does not exist" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.693248 4810 scope.go:117] "RemoveContainer" containerID="b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9" Oct 03 08:55:58 crc kubenswrapper[4810]: E1003 
08:55:58.693522 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9\": container with ID starting with b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9 not found: ID does not exist" containerID="b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.693813 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9"} err="failed to get container status \"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9\": rpc error: code = NotFound desc = could not find container \"b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9\": container with ID starting with b55c37d35e79a43c0e2324c842f2be7b325b1f8262e4318018da4de08b436ac9 not found: ID does not exist" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.724501 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs\") pod \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.724737 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle\") pod \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.724772 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config\") pod \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.724800 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config\") pod \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.724912 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkpr6\" (UniqueName: \"kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6\") pod \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\" (UID: \"f8b30b23-032e-47bd-bb1f-5f7b2c75d176\") " Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.736645 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6" (OuterVolumeSpecName: "kube-api-access-kkpr6") pod "f8b30b23-032e-47bd-bb1f-5f7b2c75d176" (UID: "f8b30b23-032e-47bd-bb1f-5f7b2c75d176"). InnerVolumeSpecName "kube-api-access-kkpr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.740362 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f8b30b23-032e-47bd-bb1f-5f7b2c75d176" (UID: "f8b30b23-032e-47bd-bb1f-5f7b2c75d176"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.782801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config" (OuterVolumeSpecName: "config") pod "f8b30b23-032e-47bd-bb1f-5f7b2c75d176" (UID: "f8b30b23-032e-47bd-bb1f-5f7b2c75d176"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.784852 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8b30b23-032e-47bd-bb1f-5f7b2c75d176" (UID: "f8b30b23-032e-47bd-bb1f-5f7b2c75d176"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.803419 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f8b30b23-032e-47bd-bb1f-5f7b2c75d176" (UID: "f8b30b23-032e-47bd-bb1f-5f7b2c75d176"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.826650 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.826692 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.826707 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.826719 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.826733 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkpr6\" (UniqueName: \"kubernetes.io/projected/f8b30b23-032e-47bd-bb1f-5f7b2c75d176-kube-api-access-kkpr6\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.991805 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:58 crc kubenswrapper[4810]: I1003 08:55:58.997568 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-dd55c49f8-dnnqz"] Oct 03 08:55:59 crc kubenswrapper[4810]: I1003 08:55:59.311875 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" path="/var/lib/kubelet/pods/f8b30b23-032e-47bd-bb1f-5f7b2c75d176/volumes" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.228300 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-w96tq"] Oct 03 08:56:03 crc kubenswrapper[4810]: E1003 08:56:03.229498 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="init" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229518 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="init" Oct 03 08:56:03 crc kubenswrapper[4810]: E1003 08:56:03.229542 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-httpd" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229550 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-httpd" Oct 03 08:56:03 crc kubenswrapper[4810]: E1003 08:56:03.229566 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="dnsmasq-dns" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229576 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="dnsmasq-dns" Oct 03 08:56:03 crc kubenswrapper[4810]: E1003 08:56:03.229587 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-api" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229595 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-api" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229789 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-api" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229813 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b30b23-032e-47bd-bb1f-5f7b2c75d176" containerName="neutron-httpd" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.229832 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9746ac25-a0c6-41c4-bf75-62f4f1ee51ce" containerName="dnsmasq-dns" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.230616 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.233191 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.233230 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.233485 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.236200 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.246206 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-452xc" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.260445 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w96tq"] Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306720 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306783 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306810 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306836 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306872 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkqj\" (UniqueName: \"kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306930 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.306990 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.326080 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.327954 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.360036 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408457 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbvj\" (UniqueName: \"kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408559 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408638 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408654 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408682 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkqj\" (UniqueName: \"kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " 
pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408706 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408749 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408779 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408802 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.408829 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.409756 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.409791 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.414000 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.421600 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 
08:56:03.428052 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.429437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.439642 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkqj\" (UniqueName: \"kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj\") pod \"swift-ring-rebalance-w96tq\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.510588 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.510660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.510700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.510748 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbvj\" (UniqueName: \"kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.510820 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.511659 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.511678 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.511913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.512176 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.527880 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbvj\" (UniqueName: \"kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj\") pod \"dnsmasq-dns-7fc544d64f-4wv6k\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.560925 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:03 crc kubenswrapper[4810]: I1003 08:56:03.671944 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.045815 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-w96tq"] Oct 03 08:56:04 crc kubenswrapper[4810]: W1003 08:56:04.047303 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd780816a_47b4_4278_a386_d7172d7aa0a0.slice/crio-b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772 WatchSource:0}: Error finding container b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772: Status 404 returned error can't find the container with id b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772 Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.050173 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.179881 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:56:04 crc kubenswrapper[4810]: W1003 08:56:04.202720 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc507139c_73f6_4921_979f_026491697652.slice/crio-f1660f6e64101c26eb8505395ac83749f554408cbadf0826120c70f700ebbfbb WatchSource:0}: Error finding container f1660f6e64101c26eb8505395ac83749f554408cbadf0826120c70f700ebbfbb: Status 404 returned error can't find the container with id f1660f6e64101c26eb8505395ac83749f554408cbadf0826120c70f700ebbfbb Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.714388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w96tq" 
event={"ID":"d780816a-47b4-4278-a386-d7172d7aa0a0","Type":"ContainerStarted","Data":"b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772"} Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.716617 4810 generic.go:334] "Generic (PLEG): container finished" podID="c507139c-73f6-4921-979f-026491697652" containerID="0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc" exitCode=0 Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.716656 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" event={"ID":"c507139c-73f6-4921-979f-026491697652","Type":"ContainerDied","Data":"0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc"} Oct 03 08:56:04 crc kubenswrapper[4810]: I1003 08:56:04.716680 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" event={"ID":"c507139c-73f6-4921-979f-026491697652","Type":"ContainerStarted","Data":"f1660f6e64101c26eb8505395ac83749f554408cbadf0826120c70f700ebbfbb"} Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.314932 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.319075 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.322209 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.342142 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.453335 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.453935 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.453995 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.454079 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7sj\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.454242 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.454293 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556072 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556128 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556150 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556192 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r7sj\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556259 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.556284 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.557104 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.557351 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd\") pod 
\"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.563675 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.563922 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.564180 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.579443 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r7sj\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj\") pod \"swift-proxy-8576cf7ff5-srfrx\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.641079 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.741990 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" event={"ID":"c507139c-73f6-4921-979f-026491697652","Type":"ContainerStarted","Data":"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b"} Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.742544 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:05 crc kubenswrapper[4810]: I1003 08:56:05.760026 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" podStartSLOduration=2.7600069400000002 podStartE2EDuration="2.76000694s" podCreationTimestamp="2025-10-03 08:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:56:05.759164857 +0000 UTC m=+7199.186415592" watchObservedRunningTime="2025-10-03 08:56:05.76000694 +0000 UTC m=+7199.187257675" Oct 03 08:56:06 crc kubenswrapper[4810]: I1003 08:56:06.214202 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:07 crc kubenswrapper[4810]: I1003 08:56:07.311758 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:56:07 crc kubenswrapper[4810]: E1003 08:56:07.312599 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:56:08 crc kubenswrapper[4810]: W1003 08:56:08.221081 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc07b51f_4e74_4100_8163_eda4588540ad.slice/crio-82578fdaa5ccb806dad8632e4e90a35b3f0d9c164e1a1f53ec571644c2fc59b6 WatchSource:0}: Error finding container 82578fdaa5ccb806dad8632e4e90a35b3f0d9c164e1a1f53ec571644c2fc59b6: Status 404 returned error can't find the container with id 82578fdaa5ccb806dad8632e4e90a35b3f0d9c164e1a1f53ec571644c2fc59b6 Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.370431 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.375117 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.377781 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.377853 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.378043 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517616 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517669 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517714 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517828 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517855 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8scx6\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: 
\"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517873 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.517945 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.518031 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620146 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620219 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620248 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8scx6\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620267 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620341 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: 
\"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620358 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620797 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.620856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.621727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.629978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.639505 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.639722 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.640686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.641450 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8scx6\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc 
kubenswrapper[4810]: I1003 08:56:08.642445 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data\") pod \"swift-proxy-5cf66c6fc5-c9szq\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.775805 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w96tq" event={"ID":"d780816a-47b4-4278-a386-d7172d7aa0a0","Type":"ContainerStarted","Data":"c83c3fd2be32d74daf1dccbe626b87b0104daacbbfa8a4c68ca2d115aefcc9c6"} Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.777592 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerStarted","Data":"43d803fdf4b2583a33c0209da00f0d726a8fa5f89a6b2fdb9e2d8feedc78c794"} Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.777625 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerStarted","Data":"82578fdaa5ccb806dad8632e4e90a35b3f0d9c164e1a1f53ec571644c2fc59b6"} Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.794122 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-w96tq" podStartSLOduration=1.5767975889999999 podStartE2EDuration="5.794100578s" podCreationTimestamp="2025-10-03 08:56:03 +0000 UTC" firstStartedPulling="2025-10-03 08:56:04.049860681 +0000 UTC m=+7197.477111416" lastFinishedPulling="2025-10-03 08:56:08.26716367 +0000 UTC m=+7201.694414405" observedRunningTime="2025-10-03 08:56:08.790374898 +0000 UTC m=+7202.217625623" watchObservedRunningTime="2025-10-03 08:56:08.794100578 +0000 UTC m=+7202.221351323" Oct 03 08:56:08 crc kubenswrapper[4810]: I1003 08:56:08.936833 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.482869 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.793708 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerStarted","Data":"8cf07646ff306cdf609d474d68039d8801f335bdbecca3e2bd309e6dbcf11286"} Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.794061 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.794139 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.799324 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerStarted","Data":"ccaddef8fdc23da0618108ad3920e7dc36166714d51af18712d008fbb1ee8cf6"} Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.800181 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerStarted","Data":"fdad967f08f16f22a27f70ae2b7d9f46686c4a201b26d0d6859a9f12cbb4fe92"} Oct 03 08:56:09 crc kubenswrapper[4810]: I1003 08:56:09.829072 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-8576cf7ff5-srfrx" podStartSLOduration=4.62176918 podStartE2EDuration="4.829054047s" podCreationTimestamp="2025-10-03 08:56:05 +0000 UTC" firstStartedPulling="2025-10-03 08:56:08.224132121 +0000 UTC m=+7201.651382856" lastFinishedPulling="2025-10-03 08:56:08.431416978 +0000 UTC m=+7201.858667723" observedRunningTime="2025-10-03 08:56:09.813578774 +0000 UTC m=+7203.240829519" watchObservedRunningTime="2025-10-03 08:56:09.829054047 +0000 UTC m=+7203.256304782" Oct 03 08:56:10 crc kubenswrapper[4810]: I1003 08:56:10.810311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerStarted","Data":"276347e95ea2f8ef1722075a72ac28686ed4623fefd6f74a753d70385f27ad2e"} Oct 03 08:56:10 crc kubenswrapper[4810]: I1003 08:56:10.811217 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:10 crc kubenswrapper[4810]: I1003 08:56:10.811237 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:10 crc kubenswrapper[4810]: I1003 08:56:10.836868 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podStartSLOduration=2.8368490509999997 podStartE2EDuration="2.836849051s" podCreationTimestamp="2025-10-03 08:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:56:10.827372428 +0000 UTC m=+7204.254623163" watchObservedRunningTime="2025-10-03 08:56:10.836849051 +0000 UTC m=+7204.264099796" Oct 03 08:56:12 crc kubenswrapper[4810]: I1003 08:56:12.824773 4810 generic.go:334] "Generic (PLEG): container finished" 
podID="d780816a-47b4-4278-a386-d7172d7aa0a0" containerID="c83c3fd2be32d74daf1dccbe626b87b0104daacbbfa8a4c68ca2d115aefcc9c6" exitCode=0 Oct 03 08:56:12 crc kubenswrapper[4810]: I1003 08:56:12.824854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w96tq" event={"ID":"d780816a-47b4-4278-a386-d7172d7aa0a0","Type":"ContainerDied","Data":"c83c3fd2be32d74daf1dccbe626b87b0104daacbbfa8a4c68ca2d115aefcc9c6"} Oct 03 08:56:13 crc kubenswrapper[4810]: I1003 08:56:13.674073 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:56:13 crc kubenswrapper[4810]: I1003 08:56:13.762036 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:56:13 crc kubenswrapper[4810]: I1003 08:56:13.762737 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="dnsmasq-dns" containerID="cri-o://25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087" gracePeriod=10 Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.278000 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.287488 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330149 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330207 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config\") pod \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330234 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330315 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwkqj\" (UniqueName: \"kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330354 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb75z\" (UniqueName: 
\"kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z\") pod \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330386 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330420 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb\") pod \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330451 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc\") pod \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330472 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb\") pod \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\" (UID: \"86bef6b2-bea6-4405-a2e7-6b033769cbdb\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.330550 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf\") pod \"d780816a-47b4-4278-a386-d7172d7aa0a0\" (UID: \"d780816a-47b4-4278-a386-d7172d7aa0a0\") " Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.331221 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.332248 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.337936 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z" (OuterVolumeSpecName: "kube-api-access-cb75z") pod "86bef6b2-bea6-4405-a2e7-6b033769cbdb" (UID: "86bef6b2-bea6-4405-a2e7-6b033769cbdb"). 
InnerVolumeSpecName "kube-api-access-cb75z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.363169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj" (OuterVolumeSpecName: "kube-api-access-vwkqj") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "kube-api-access-vwkqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.373078 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.373117 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.375325 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts" (OuterVolumeSpecName: "scripts") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.382840 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d780816a-47b4-4278-a386-d7172d7aa0a0" (UID: "d780816a-47b4-4278-a386-d7172d7aa0a0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.397483 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86bef6b2-bea6-4405-a2e7-6b033769cbdb" (UID: "86bef6b2-bea6-4405-a2e7-6b033769cbdb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.408013 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86bef6b2-bea6-4405-a2e7-6b033769cbdb" (UID: "86bef6b2-bea6-4405-a2e7-6b033769cbdb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.408335 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86bef6b2-bea6-4405-a2e7-6b033769cbdb" (UID: "86bef6b2-bea6-4405-a2e7-6b033769cbdb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.416576 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config" (OuterVolumeSpecName: "config") pod "86bef6b2-bea6-4405-a2e7-6b033769cbdb" (UID: "86bef6b2-bea6-4405-a2e7-6b033769cbdb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432811 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d780816a-47b4-4278-a386-d7172d7aa0a0-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432848 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432860 4810 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432874 4810 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432885 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwkqj\" (UniqueName: \"kubernetes.io/projected/d780816a-47b4-4278-a386-d7172d7aa0a0-kube-api-access-vwkqj\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432915 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb75z\" (UniqueName: \"kubernetes.io/projected/86bef6b2-bea6-4405-a2e7-6b033769cbdb-kube-api-access-cb75z\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432928 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432939 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432949 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432960 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86bef6b2-bea6-4405-a2e7-6b033769cbdb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432971 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d780816a-47b4-4278-a386-d7172d7aa0a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.432982 4810 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/d780816a-47b4-4278-a386-d7172d7aa0a0-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.842982 4810 generic.go:334] "Generic (PLEG): container finished" podID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerID="25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087" exitCode=0 Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.843031 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" event={"ID":"86bef6b2-bea6-4405-a2e7-6b033769cbdb","Type":"ContainerDied","Data":"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087"} Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.843067 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.843090 4810 scope.go:117] "RemoveContainer" containerID="25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.843073 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d9559fbd5-qh82b" event={"ID":"86bef6b2-bea6-4405-a2e7-6b033769cbdb","Type":"ContainerDied","Data":"0b2b9b6bb464e424fdd7a8ff2838c5be22c51ad2d73b6b10ba30f870d7665f34"} Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.845293 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-w96tq" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.845210 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-w96tq" event={"ID":"d780816a-47b4-4278-a386-d7172d7aa0a0","Type":"ContainerDied","Data":"b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772"} Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.845405 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b337cc372fd05dada592a521e113398fb117b5e2fd5b211bfc1979fe26b42772" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.868738 4810 scope.go:117] "RemoveContainer" containerID="0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.883819 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.892488 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d9559fbd5-qh82b"] Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.896069 4810 scope.go:117] "RemoveContainer" containerID="25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087" Oct 03 08:56:14 crc kubenswrapper[4810]: E1003 08:56:14.896605 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087\": container with ID starting with 25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087 not found: ID does not exist" containerID="25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.896698 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087"} err="failed to get container status 
\"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087\": rpc error: code = NotFound desc = could not find container \"25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087\": container with ID starting with 25e9cbb85cb968363997378a70b759d9527d91f8d8a98c0c1f8e462fedc0b087 not found: ID does not exist" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.896801 4810 scope.go:117] "RemoveContainer" containerID="0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06" Oct 03 08:56:14 crc kubenswrapper[4810]: E1003 08:56:14.897199 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06\": container with ID starting with 0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06 not found: ID does not exist" containerID="0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06" Oct 03 08:56:14 crc kubenswrapper[4810]: I1003 08:56:14.897240 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06"} err="failed to get container status \"0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06\": rpc error: code = NotFound desc = could not find container \"0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06\": container with ID starting with 0cce57c3391082ee6042c8cab87106b5df75fdfcc6648cc01eb8ae3a603fba06 not found: ID does not exist" Oct 03 08:56:15 crc kubenswrapper[4810]: I1003 08:56:15.318141 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" path="/var/lib/kubelet/pods/86bef6b2-bea6-4405-a2e7-6b033769cbdb/volumes" Oct 03 08:56:15 crc kubenswrapper[4810]: I1003 08:56:15.644670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:15 crc kubenswrapper[4810]: I1003 08:56:15.646479 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:16 crc kubenswrapper[4810]: I1003 08:56:16.341044 4810 scope.go:117] "RemoveContainer" containerID="af32633d3f326b2eab77b701fa7cb1357a667fd6b43274f857eefef45aaf2659" Oct 03 08:56:16 crc kubenswrapper[4810]: I1003 08:56:16.363403 4810 scope.go:117] "RemoveContainer" containerID="d53e4485f9d8227abdd2fd4a0ac8deb08a8686275a7607b113f97741a0991768" Oct 03 08:56:18 crc kubenswrapper[4810]: I1003 08:56:18.941680 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:18 crc kubenswrapper[4810]: I1003 08:56:18.945420 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.022485 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.022956 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-8576cf7ff5-srfrx" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-httpd" containerID="cri-o://43d803fdf4b2583a33c0209da00f0d726a8fa5f89a6b2fdb9e2d8feedc78c794" gracePeriod=30 Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.023720 4810 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/swift-proxy-8576cf7ff5-srfrx" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-server" containerID="cri-o://8cf07646ff306cdf609d474d68039d8801f335bdbecca3e2bd309e6dbcf11286" gracePeriod=30 Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.889527 4810 generic.go:334] "Generic (PLEG): container finished" podID="dc07b51f-4e74-4100-8163-eda4588540ad" containerID="8cf07646ff306cdf609d474d68039d8801f335bdbecca3e2bd309e6dbcf11286" exitCode=0 Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.889859 4810 generic.go:334] "Generic (PLEG): container finished" podID="dc07b51f-4e74-4100-8163-eda4588540ad" containerID="43d803fdf4b2583a33c0209da00f0d726a8fa5f89a6b2fdb9e2d8feedc78c794" exitCode=0 Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.890049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerDied","Data":"8cf07646ff306cdf609d474d68039d8801f335bdbecca3e2bd309e6dbcf11286"} Oct 03 08:56:19 crc kubenswrapper[4810]: I1003 08:56:19.890121 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerDied","Data":"43d803fdf4b2583a33c0209da00f0d726a8fa5f89a6b2fdb9e2d8feedc78c794"} Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.087133 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152128 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152183 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152233 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152264 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r7sj\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152401 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152444 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd\") pod \"dc07b51f-4e74-4100-8163-eda4588540ad\" (UID: \"dc07b51f-4e74-4100-8163-eda4588540ad\") " Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152815 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.152861 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.153303 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.153328 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc07b51f-4e74-4100-8163-eda4588540ad-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.161157 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.161229 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj" (OuterVolumeSpecName: "kube-api-access-2r7sj") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "kube-api-access-2r7sj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.207237 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data" (OuterVolumeSpecName: "config-data") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.222452 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc07b51f-4e74-4100-8163-eda4588540ad" (UID: "dc07b51f-4e74-4100-8163-eda4588540ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.255213 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.255253 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.255265 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r7sj\" (UniqueName: \"kubernetes.io/projected/dc07b51f-4e74-4100-8163-eda4588540ad-kube-api-access-2r7sj\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.255276 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc07b51f-4e74-4100-8163-eda4588540ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.303125 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:56:20 crc kubenswrapper[4810]: E1003 08:56:20.303431 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.899641 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-8576cf7ff5-srfrx" event={"ID":"dc07b51f-4e74-4100-8163-eda4588540ad","Type":"ContainerDied","Data":"82578fdaa5ccb806dad8632e4e90a35b3f0d9c164e1a1f53ec571644c2fc59b6"} Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.899816 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-8576cf7ff5-srfrx" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.900052 4810 scope.go:117] "RemoveContainer" containerID="8cf07646ff306cdf609d474d68039d8801f335bdbecca3e2bd309e6dbcf11286" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.922393 4810 scope.go:117] "RemoveContainer" containerID="43d803fdf4b2583a33c0209da00f0d726a8fa5f89a6b2fdb9e2d8feedc78c794" Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.937186 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:20 crc kubenswrapper[4810]: I1003 08:56:20.944876 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-8576cf7ff5-srfrx"] Oct 03 08:56:21 crc kubenswrapper[4810]: I1003 08:56:21.313572 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" path="/var/lib/kubelet/pods/dc07b51f-4e74-4100-8163-eda4588540ad/volumes" Oct 03 08:56:32 crc kubenswrapper[4810]: I1003 08:56:32.303235 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:56:32 crc kubenswrapper[4810]: E1003 08:56:32.304157 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:56:46 crc kubenswrapper[4810]: I1003 08:56:46.305668 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:56:46 crc kubenswrapper[4810]: E1003 08:56:46.308717 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.017767 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lrm7c"] Oct 03 08:56:51 crc kubenswrapper[4810]: E1003 08:56:51.019826 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-server" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.020004 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-server" Oct 03 08:56:51 crc kubenswrapper[4810]: E1003 08:56:51.020091 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d780816a-47b4-4278-a386-d7172d7aa0a0" containerName="swift-ring-rebalance" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.020161 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d780816a-47b4-4278-a386-d7172d7aa0a0" containerName="swift-ring-rebalance" Oct 03 08:56:51 crc kubenswrapper[4810]: E1003 08:56:51.020262 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-httpd" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 
08:56:51.020337 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-httpd" Oct 03 08:56:51 crc kubenswrapper[4810]: E1003 08:56:51.020409 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="init" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.020486 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="init" Oct 03 08:56:51 crc kubenswrapper[4810]: E1003 08:56:51.020572 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="dnsmasq-dns" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.020642 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="dnsmasq-dns" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.020938 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d780816a-47b4-4278-a386-d7172d7aa0a0" containerName="swift-ring-rebalance" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.021055 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="86bef6b2-bea6-4405-a2e7-6b033769cbdb" containerName="dnsmasq-dns" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.021141 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-httpd" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.021220 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc07b51f-4e74-4100-8163-eda4588540ad" containerName="proxy-server" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.022036 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.026625 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lrm7c"] Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.135557 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb5z8\" (UniqueName: \"kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8\") pod \"cinder-db-create-lrm7c\" (UID: \"e196d4f5-94ac-4345-8231-4b7e71f0ea1d\") " pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.237325 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb5z8\" (UniqueName: \"kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8\") pod \"cinder-db-create-lrm7c\" (UID: \"e196d4f5-94ac-4345-8231-4b7e71f0ea1d\") " pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.267913 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb5z8\" (UniqueName: \"kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8\") pod \"cinder-db-create-lrm7c\" (UID: \"e196d4f5-94ac-4345-8231-4b7e71f0ea1d\") " pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.340153 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:51 crc kubenswrapper[4810]: I1003 08:56:51.775942 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lrm7c"] Oct 03 08:56:52 crc kubenswrapper[4810]: I1003 08:56:52.152533 4810 generic.go:334] "Generic (PLEG): container finished" podID="e196d4f5-94ac-4345-8231-4b7e71f0ea1d" containerID="bc3a0f56765ba9c994d5d365132eda2046effd5dc5d39bff5bc874426f4b6f6c" exitCode=0 Oct 03 08:56:52 crc kubenswrapper[4810]: I1003 08:56:52.152585 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrm7c" event={"ID":"e196d4f5-94ac-4345-8231-4b7e71f0ea1d","Type":"ContainerDied","Data":"bc3a0f56765ba9c994d5d365132eda2046effd5dc5d39bff5bc874426f4b6f6c"} Oct 03 08:56:52 crc kubenswrapper[4810]: I1003 08:56:52.152613 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrm7c" event={"ID":"e196d4f5-94ac-4345-8231-4b7e71f0ea1d","Type":"ContainerStarted","Data":"06d91f6e64c133288fa33f29591809754a2f89057742348dfd11e3f7640dc180"} Oct 03 08:56:53 crc kubenswrapper[4810]: I1003 08:56:53.461388 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:53 crc kubenswrapper[4810]: I1003 08:56:53.477494 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb5z8\" (UniqueName: \"kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8\") pod \"e196d4f5-94ac-4345-8231-4b7e71f0ea1d\" (UID: \"e196d4f5-94ac-4345-8231-4b7e71f0ea1d\") " Oct 03 08:56:53 crc kubenswrapper[4810]: I1003 08:56:53.483688 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8" (OuterVolumeSpecName: "kube-api-access-vb5z8") pod "e196d4f5-94ac-4345-8231-4b7e71f0ea1d" (UID: "e196d4f5-94ac-4345-8231-4b7e71f0ea1d"). InnerVolumeSpecName "kube-api-access-vb5z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:53 crc kubenswrapper[4810]: I1003 08:56:53.580097 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb5z8\" (UniqueName: \"kubernetes.io/projected/e196d4f5-94ac-4345-8231-4b7e71f0ea1d-kube-api-access-vb5z8\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4810]: I1003 08:56:54.170918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lrm7c" event={"ID":"e196d4f5-94ac-4345-8231-4b7e71f0ea1d","Type":"ContainerDied","Data":"06d91f6e64c133288fa33f29591809754a2f89057742348dfd11e3f7640dc180"} Oct 03 08:56:54 crc kubenswrapper[4810]: I1003 08:56:54.171210 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06d91f6e64c133288fa33f29591809754a2f89057742348dfd11e3f7640dc180" Oct 03 08:56:54 crc kubenswrapper[4810]: I1003 08:56:54.170984 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lrm7c" Oct 03 08:56:57 crc kubenswrapper[4810]: I1003 08:56:57.307193 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:56:57 crc kubenswrapper[4810]: E1003 08:56:57.307688 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.117959 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4b2d-account-create-5hn74"] Oct 03 08:57:01 crc kubenswrapper[4810]: E1003 08:57:01.118716 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e196d4f5-94ac-4345-8231-4b7e71f0ea1d" containerName="mariadb-database-create" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.118756 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e196d4f5-94ac-4345-8231-4b7e71f0ea1d" containerName="mariadb-database-create" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.119015 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e196d4f5-94ac-4345-8231-4b7e71f0ea1d" containerName="mariadb-database-create" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.119708 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.122790 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.128518 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4b2d-account-create-5hn74"] Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.208680 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j762j\" (UniqueName: \"kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j\") pod \"cinder-4b2d-account-create-5hn74\" (UID: \"da33a722-8402-4b5c-acce-d1c1c974278c\") " pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.310992 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j762j\" (UniqueName: \"kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j\") pod \"cinder-4b2d-account-create-5hn74\" (UID: \"da33a722-8402-4b5c-acce-d1c1c974278c\") " pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.331450 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j762j\" (UniqueName: \"kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j\") pod \"cinder-4b2d-account-create-5hn74\" (UID: \"da33a722-8402-4b5c-acce-d1c1c974278c\") " pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.448150 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:01 crc kubenswrapper[4810]: I1003 08:57:01.876534 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4b2d-account-create-5hn74"] Oct 03 08:57:02 crc kubenswrapper[4810]: I1003 08:57:02.251200 4810 generic.go:334] "Generic (PLEG): container finished" podID="da33a722-8402-4b5c-acce-d1c1c974278c" containerID="ef9a092d8564e390c88525c36f006b321ef2f7ce62522db6f48f417ef6ff5929" exitCode=0 Oct 03 08:57:02 crc kubenswrapper[4810]: I1003 08:57:02.251489 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4b2d-account-create-5hn74" event={"ID":"da33a722-8402-4b5c-acce-d1c1c974278c","Type":"ContainerDied","Data":"ef9a092d8564e390c88525c36f006b321ef2f7ce62522db6f48f417ef6ff5929"} Oct 03 08:57:02 crc kubenswrapper[4810]: I1003 08:57:02.251517 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4b2d-account-create-5hn74" event={"ID":"da33a722-8402-4b5c-acce-d1c1c974278c","Type":"ContainerStarted","Data":"d773ac77e28cc54e468694841c0bb8a0ff7895862d80149c6cb1940b622edcd3"} Oct 03 08:57:03 crc kubenswrapper[4810]: I1003 08:57:03.584791 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:03 crc kubenswrapper[4810]: I1003 08:57:03.757485 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j762j\" (UniqueName: \"kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j\") pod \"da33a722-8402-4b5c-acce-d1c1c974278c\" (UID: \"da33a722-8402-4b5c-acce-d1c1c974278c\") " Oct 03 08:57:03 crc kubenswrapper[4810]: I1003 08:57:03.765085 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j" (OuterVolumeSpecName: "kube-api-access-j762j") pod "da33a722-8402-4b5c-acce-d1c1c974278c" (UID: "da33a722-8402-4b5c-acce-d1c1c974278c"). InnerVolumeSpecName "kube-api-access-j762j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:03 crc kubenswrapper[4810]: I1003 08:57:03.859279 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j762j\" (UniqueName: \"kubernetes.io/projected/da33a722-8402-4b5c-acce-d1c1c974278c-kube-api-access-j762j\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:04 crc kubenswrapper[4810]: I1003 08:57:04.275182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4b2d-account-create-5hn74" event={"ID":"da33a722-8402-4b5c-acce-d1c1c974278c","Type":"ContainerDied","Data":"d773ac77e28cc54e468694841c0bb8a0ff7895862d80149c6cb1940b622edcd3"} Oct 03 08:57:04 crc kubenswrapper[4810]: I1003 08:57:04.275224 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d773ac77e28cc54e468694841c0bb8a0ff7895862d80149c6cb1940b622edcd3" Oct 03 08:57:04 crc kubenswrapper[4810]: I1003 08:57:04.275273 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4b2d-account-create-5hn74" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.261513 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-5k6fj"] Oct 03 08:57:06 crc kubenswrapper[4810]: E1003 08:57:06.262373 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da33a722-8402-4b5c-acce-d1c1c974278c" containerName="mariadb-account-create" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.262387 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="da33a722-8402-4b5c-acce-d1c1c974278c" containerName="mariadb-account-create" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.262561 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="da33a722-8402-4b5c-acce-d1c1c974278c" containerName="mariadb-account-create" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.263260 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.266076 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x5rdg" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.272524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5k6fj"] Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.280101 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.280857 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.403434 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.403479 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.403532 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.404063 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvwmc\" (UniqueName: \"kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.404303 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data\") pod 
\"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.404432 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.506773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvwmc\" (UniqueName: \"kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.507300 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.507460 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.507727 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.507773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.507868 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.508021 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.513915 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.514431 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.514626 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.517640 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.531953 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvwmc\" (UniqueName: \"kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc\") pod \"cinder-db-sync-5k6fj\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:06 crc kubenswrapper[4810]: I1003 08:57:06.596663 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:07 crc kubenswrapper[4810]: I1003 08:57:07.062063 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-5k6fj"] Oct 03 08:57:07 crc kubenswrapper[4810]: I1003 08:57:07.324169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5k6fj" event={"ID":"4bda40fe-b70d-45a5-a292-a1a27fae5e10","Type":"ContainerStarted","Data":"2132315ec58bf8e4590381395c3c697103c3394dea6a5dbce77725fc83127120"} Oct 03 08:57:08 crc kubenswrapper[4810]: I1003 08:57:08.302671 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:57:08 crc kubenswrapper[4810]: E1003 08:57:08.303230 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:57:21 crc kubenswrapper[4810]: I1003 08:57:21.302880 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:57:21 crc kubenswrapper[4810]: E1003 08:57:21.303941 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:57:27 crc kubenswrapper[4810]: E1003 08:57:27.777076 4810 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:88dc57612f447daadb492dcf3ad854ac" Oct 03 08:57:27 crc kubenswrapper[4810]: E1003 08:57:27.777955 4810 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:88dc57612f447daadb492dcf3ad854ac" Oct 03 08:57:27 crc kubenswrapper[4810]: E1003 08:57:27.778373 4810 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:88dc57612f447daadb492dcf3ad854ac,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvwmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-5k6fj_openstack(4bda40fe-b70d-45a5-a292-a1a27fae5e10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:57:27 crc kubenswrapper[4810]: E1003 08:57:27.779680 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-5k6fj" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" Oct 03 08:57:28 crc kubenswrapper[4810]: E1003 08:57:28.535719 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:88dc57612f447daadb492dcf3ad854ac\\\"\"" pod="openstack/cinder-db-sync-5k6fj" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" Oct 03 08:57:33 crc kubenswrapper[4810]: I1003 08:57:33.304061 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:57:33 crc kubenswrapper[4810]: E1003 08:57:33.305076 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.684605 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.690256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5k6fj" event={"ID":"4bda40fe-b70d-45a5-a292-a1a27fae5e10","Type":"ContainerStarted","Data":"b0d216d372f54f7525fe772f1bffc384d325ecc2daa15973b2e367b10c0924e3"} Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.690821 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.699851 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.740356 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-5k6fj" podStartSLOduration=2.271263157 podStartE2EDuration="38.740338618s" podCreationTimestamp="2025-10-03 08:57:06 +0000 UTC" firstStartedPulling="2025-10-03 08:57:07.042387476 +0000 UTC m=+7260.469638211" lastFinishedPulling="2025-10-03 08:57:43.511462937 +0000 UTC m=+7296.938713672" observedRunningTime="2025-10-03 08:57:44.736763492 +0000 UTC m=+7298.164014227" watchObservedRunningTime="2025-10-03 08:57:44.740338618 +0000 UTC m=+7298.167589353" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.802794 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.803073 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdg8\" (UniqueName: \"kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.803214 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.905566 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.905650 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdg8\" (UniqueName: \"kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.905708 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.906185 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.906204 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:44 crc kubenswrapper[4810]: I1003 08:57:44.924627 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdg8\" (UniqueName: \"kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8\") pod \"certified-operators-g9mts\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:45 crc kubenswrapper[4810]: I1003 08:57:45.017178 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:45 crc kubenswrapper[4810]: I1003 08:57:45.307713 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:57:45 crc kubenswrapper[4810]: E1003 08:57:45.308351 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:57:45 crc kubenswrapper[4810]: W1003 08:57:45.542052 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod013a4732_db1c_4da7_9282_fa56fd2088a2.slice/crio-d7bc71c7045e5821b182f2ac43cc2edd132a57634959a70154e17554ca040ee9 WatchSource:0}: Error finding container d7bc71c7045e5821b182f2ac43cc2edd132a57634959a70154e17554ca040ee9: Status 404 returned error can't find the container with id d7bc71c7045e5821b182f2ac43cc2edd132a57634959a70154e17554ca040ee9 Oct 03 08:57:45 crc kubenswrapper[4810]: I1003 08:57:45.542793 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:45 crc kubenswrapper[4810]: I1003 08:57:45.699267 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerStarted","Data":"d7bc71c7045e5821b182f2ac43cc2edd132a57634959a70154e17554ca040ee9"} Oct 03 08:57:46 crc kubenswrapper[4810]: I1003 08:57:46.714170 4810 generic.go:334] "Generic (PLEG): container finished" podID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" containerID="b0d216d372f54f7525fe772f1bffc384d325ecc2daa15973b2e367b10c0924e3" exitCode=0 Oct 03 08:57:46 crc kubenswrapper[4810]: I1003 08:57:46.714241 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5k6fj" event={"ID":"4bda40fe-b70d-45a5-a292-a1a27fae5e10","Type":"ContainerDied","Data":"b0d216d372f54f7525fe772f1bffc384d325ecc2daa15973b2e367b10c0924e3"} Oct 03 08:57:46 crc kubenswrapper[4810]: I1003 08:57:46.716262 4810 generic.go:334] "Generic (PLEG): container finished" podID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerID="e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483" exitCode=0 Oct 03 08:57:46 crc kubenswrapper[4810]: I1003 08:57:46.716311 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerDied","Data":"e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483"} Oct 03 08:57:47 crc kubenswrapper[4810]: I1003 08:57:47.726829 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerStarted","Data":"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9"} Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.049064 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080335 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080379 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080403 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvwmc\" (UniqueName: \"kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080650 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.080705 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle\") pod \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\" (UID: \"4bda40fe-b70d-45a5-a292-a1a27fae5e10\") " Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.081241 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.086326 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.086402 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts" (OuterVolumeSpecName: "scripts") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.086515 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc" (OuterVolumeSpecName: "kube-api-access-lvwmc") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "kube-api-access-lvwmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.109511 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.128732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data" (OuterVolumeSpecName: "config-data") pod "4bda40fe-b70d-45a5-a292-a1a27fae5e10" (UID: "4bda40fe-b70d-45a5-a292-a1a27fae5e10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.182942 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.182979 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.182988 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.182996 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4bda40fe-b70d-45a5-a292-a1a27fae5e10-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.183007 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvwmc\" (UniqueName: \"kubernetes.io/projected/4bda40fe-b70d-45a5-a292-a1a27fae5e10-kube-api-access-lvwmc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.183017 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bda40fe-b70d-45a5-a292-a1a27fae5e10-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.736471 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-5k6fj" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.736461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-5k6fj" event={"ID":"4bda40fe-b70d-45a5-a292-a1a27fae5e10","Type":"ContainerDied","Data":"2132315ec58bf8e4590381395c3c697103c3394dea6a5dbce77725fc83127120"} Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.736616 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2132315ec58bf8e4590381395c3c697103c3394dea6a5dbce77725fc83127120" Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.738993 4810 generic.go:334] "Generic (PLEG): container finished" podID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerID="d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9" exitCode=0 Oct 03 08:57:48 crc kubenswrapper[4810]: I1003 08:57:48.739041 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerDied","Data":"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9"} Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.408831 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] Oct 03 08:57:49 crc kubenswrapper[4810]: E1003 08:57:49.411966 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" containerName="cinder-db-sync" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.411987 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" containerName="cinder-db-sync" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.412379 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" containerName="cinder-db-sync" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.416650 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.439081 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.559770 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.562243 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.564976 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.568526 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x5rdg" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.568620 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.569084 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.575952 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.617715 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.617794 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8tp\" (UniqueName: \"kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.617849 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.617869 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.617968 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720188 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720255 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb\") pod 
\"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720280 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720338 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtnpv\" (UniqueName: \"kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720359 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8tp\" (UniqueName: \"kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720381 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720400 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720417 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720456 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.720478 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc 
kubenswrapper[4810]: I1003 08:57:49.720529 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.721596 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.721627 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.721825 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.722143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.755472 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8tp\" (UniqueName: \"kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp\") pod \"dnsmasq-dns-58b5588555-t7ffc\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.772948 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822616 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822711 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822770 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtnpv\" (UniqueName: \"kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822810 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.822920 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.823039 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.824647 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.826842 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " 
pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.828316 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.830436 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.833624 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.844851 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtnpv\" (UniqueName: \"kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv\") pod \"cinder-api-0\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " pod="openstack/cinder-api-0" Oct 03 08:57:49 crc kubenswrapper[4810]: I1003 08:57:49.888184 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.292543 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.425524 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:50 crc kubenswrapper[4810]: W1003 08:57:50.430211 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod768258c1_acd5_41a9_a813_2761974c32d5.slice/crio-12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360 WatchSource:0}: Error finding container 12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360: Status 404 returned error can't find the container with id 12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360 Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.764342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerStarted","Data":"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f"} Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.770304 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerStarted","Data":"12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360"} Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.783621 4810 generic.go:334] "Generic (PLEG): container finished" podID="afb6bb02-2adf-4187-9461-282a461bfd23" containerID="312bac1b8a387e387f2de5c3d4db9bb82681f68f4799b5e3e4809f40b7ed6d90" exitCode=0 Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.783691 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" 
event={"ID":"afb6bb02-2adf-4187-9461-282a461bfd23","Type":"ContainerDied","Data":"312bac1b8a387e387f2de5c3d4db9bb82681f68f4799b5e3e4809f40b7ed6d90"} Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.783758 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" event={"ID":"afb6bb02-2adf-4187-9461-282a461bfd23","Type":"ContainerStarted","Data":"e499fc2e695078f457daa0046493268b670c7fb2dde7b7b361b452534058adc9"} Oct 03 08:57:50 crc kubenswrapper[4810]: I1003 08:57:50.792475 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g9mts" podStartSLOduration=3.879199175 podStartE2EDuration="6.792450374s" podCreationTimestamp="2025-10-03 08:57:44 +0000 UTC" firstStartedPulling="2025-10-03 08:57:46.718450415 +0000 UTC m=+7300.145701150" lastFinishedPulling="2025-10-03 08:57:49.631701614 +0000 UTC m=+7303.058952349" observedRunningTime="2025-10-03 08:57:50.780210307 +0000 UTC m=+7304.207461052" watchObservedRunningTime="2025-10-03 08:57:50.792450374 +0000 UTC m=+7304.219701109" Oct 03 08:57:51 crc kubenswrapper[4810]: I1003 08:57:51.585321 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:51 crc kubenswrapper[4810]: I1003 08:57:51.807470 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerStarted","Data":"ffc55cc6f13bda534a3bddbd2bbd6967e5152638c775c3a6f3127783f9fc9c4a"} Oct 03 08:57:51 crc kubenswrapper[4810]: I1003 08:57:51.826935 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" event={"ID":"afb6bb02-2adf-4187-9461-282a461bfd23","Type":"ContainerStarted","Data":"81d66d489d309b6cbd778c624c7816650200623b0f4c2bf3a68ef11fdf934462"} Oct 03 08:57:51 crc kubenswrapper[4810]: I1003 08:57:51.827005 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.836554 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api-log" containerID="cri-o://ffc55cc6f13bda534a3bddbd2bbd6967e5152638c775c3a6f3127783f9fc9c4a" gracePeriod=30 Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.836998 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerStarted","Data":"4482b834e96ae5d1998f4490aac21c902880d5d6231a7c4e7cd7109e90397dd2"} Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.837261 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api" containerID="cri-o://4482b834e96ae5d1998f4490aac21c902880d5d6231a7c4e7cd7109e90397dd2" gracePeriod=30 Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.839779 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.869166 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" podStartSLOduration=3.869150385 podStartE2EDuration="3.869150385s" podCreationTimestamp="2025-10-03 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:51.873231388 +0000 UTC m=+7305.300482123" watchObservedRunningTime="2025-10-03 08:57:52.869150385 +0000 UTC m=+7306.296401120" Oct 03 08:57:52 crc kubenswrapper[4810]: I1003 08:57:52.871737 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.871727823 podStartE2EDuration="3.871727823s" podCreationTimestamp="2025-10-03 08:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:52.865387585 +0000 UTC m=+7306.292638330" watchObservedRunningTime="2025-10-03 08:57:52.871727823 +0000 UTC m=+7306.298978558" Oct 03 08:57:53 crc kubenswrapper[4810]: I1003 08:57:53.845835 4810 generic.go:334] "Generic (PLEG): container finished" podID="768258c1-acd5-41a9-a813-2761974c32d5" containerID="ffc55cc6f13bda534a3bddbd2bbd6967e5152638c775c3a6f3127783f9fc9c4a" exitCode=143 Oct 03 08:57:53 crc kubenswrapper[4810]: I1003 08:57:53.845929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerDied","Data":"ffc55cc6f13bda534a3bddbd2bbd6967e5152638c775c3a6f3127783f9fc9c4a"} Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.017328 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.017390 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.067823 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.867241 4810 generic.go:334] "Generic (PLEG): container finished" podID="768258c1-acd5-41a9-a813-2761974c32d5" containerID="4482b834e96ae5d1998f4490aac21c902880d5d6231a7c4e7cd7109e90397dd2" exitCode=0 Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.867460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerDied","Data":"4482b834e96ae5d1998f4490aac21c902880d5d6231a7c4e7cd7109e90397dd2"} Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.921626 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:55 crc kubenswrapper[4810]: I1003 08:57:55.974781 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:56 crc kubenswrapper[4810]: I1003 08:57:56.881424 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"768258c1-acd5-41a9-a813-2761974c32d5","Type":"ContainerDied","Data":"12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360"} Oct 03 08:57:56 crc kubenswrapper[4810]: I1003 08:57:56.881783 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12df25be11227874e9d8835184a8f1411ad3877a0d055d974fa01d91e8e04360" Oct 03 08:57:56 crc kubenswrapper[4810]: I1003 08:57:56.929739 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075239 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075333 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075356 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075440 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075475 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075495 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtnpv\" (UniqueName: \"kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075509 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.075536 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom\") pod \"768258c1-acd5-41a9-a813-2761974c32d5\" (UID: \"768258c1-acd5-41a9-a813-2761974c32d5\") " Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.076381 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs" (OuterVolumeSpecName: "logs") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.080553 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/768258c1-acd5-41a9-a813-2761974c32d5-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.080629 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/768258c1-acd5-41a9-a813-2761974c32d5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.099220 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv" (OuterVolumeSpecName: "kube-api-access-gtnpv") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "kube-api-access-gtnpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.099220 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.104397 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts" (OuterVolumeSpecName: "scripts") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.115787 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.163718 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data" (OuterVolumeSpecName: "config-data") pod "768258c1-acd5-41a9-a813-2761974c32d5" (UID: "768258c1-acd5-41a9-a813-2761974c32d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.182261 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.182325 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.182335 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.182345 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtnpv\" (UniqueName: \"kubernetes.io/projected/768258c1-acd5-41a9-a813-2761974c32d5-kube-api-access-gtnpv\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.182357 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/768258c1-acd5-41a9-a813-2761974c32d5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.889642 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.889703 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g9mts" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="registry-server" containerID="cri-o://47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f" gracePeriod=2 Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.922575 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.935243 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.947599 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:57 crc kubenswrapper[4810]: E1003 08:57:57.948248 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.948278 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api" Oct 03 08:57:57 crc kubenswrapper[4810]: E1003 08:57:57.948307 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api-log" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.948322 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api-log" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.948639 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="768258c1-acd5-41a9-a813-2761974c32d5" containerName="cinder-api-log" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.948715 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="768258c1-acd5-41a9-a813-2761974c32d5" 
containerName="cinder-api" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.950259 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.953346 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.953571 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.953597 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.953742 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x5rdg" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.954427 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.954566 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:57:57 crc kubenswrapper[4810]: I1003 08:57:57.969447 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.099949 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.099989 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100029 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100115 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100149 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crlxk\" (UniqueName: \"kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk\") pod \"cinder-api-0\" (UID: 
\"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100234 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100299 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.100327 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.201963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crlxk\" (UniqueName: \"kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202018 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202089 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202116 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202180 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202205 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202249 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202283 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.202402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.203247 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.208336 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.210382 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.210410 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.210609 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.210936 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.215873 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs\") pod 
\"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.218608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crlxk\" (UniqueName: \"kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk\") pod \"cinder-api-0\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.269219 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.398502 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.506114 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsdg8\" (UniqueName: \"kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8\") pod \"013a4732-db1c-4da7-9282-fa56fd2088a2\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.506645 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content\") pod \"013a4732-db1c-4da7-9282-fa56fd2088a2\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.506828 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities\") pod \"013a4732-db1c-4da7-9282-fa56fd2088a2\" (UID: \"013a4732-db1c-4da7-9282-fa56fd2088a2\") " Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.507461 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities" (OuterVolumeSpecName: "utilities") pod "013a4732-db1c-4da7-9282-fa56fd2088a2" (UID: "013a4732-db1c-4da7-9282-fa56fd2088a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.510462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8" (OuterVolumeSpecName: "kube-api-access-qsdg8") pod "013a4732-db1c-4da7-9282-fa56fd2088a2" (UID: "013a4732-db1c-4da7-9282-fa56fd2088a2"). InnerVolumeSpecName "kube-api-access-qsdg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.574429 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "013a4732-db1c-4da7-9282-fa56fd2088a2" (UID: "013a4732-db1c-4da7-9282-fa56fd2088a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.608877 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.608929 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsdg8\" (UniqueName: \"kubernetes.io/projected/013a4732-db1c-4da7-9282-fa56fd2088a2-kube-api-access-qsdg8\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.608942 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/013a4732-db1c-4da7-9282-fa56fd2088a2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.754346 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.899969 4810 generic.go:334] "Generic (PLEG): container finished" podID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerID="47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f" exitCode=0 Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.900007 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerDied","Data":"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f"} Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.900023 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9mts" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.900040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9mts" event={"ID":"013a4732-db1c-4da7-9282-fa56fd2088a2","Type":"ContainerDied","Data":"d7bc71c7045e5821b182f2ac43cc2edd132a57634959a70154e17554ca040ee9"} Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.900055 4810 scope.go:117] "RemoveContainer" containerID="47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.904532 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerStarted","Data":"548628f4fb93f96b9f4e39814c7de7303de436a7db01c2c1efa11a283ba1eb61"} Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.935458 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.937963 4810 scope.go:117] "RemoveContainer" containerID="d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.948250 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g9mts"] Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.959664 4810 scope.go:117] "RemoveContainer" containerID="e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.988682 4810 scope.go:117] "RemoveContainer" containerID="47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f" Oct 03 08:57:58 crc kubenswrapper[4810]: E1003 08:57:58.989291 4810 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f\": container with ID starting with 47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f not found: ID does not exist" containerID="47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.989386 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f"} err="failed to get container status \"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f\": rpc error: code = NotFound desc = could not find container \"47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f\": container with ID starting with 47d6d22ee995aa84d9b8de106177b634425edc83d05d9da036efa1b29b16803f not found: ID does not exist" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.989434 4810 scope.go:117] "RemoveContainer" containerID="d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9" Oct 03 08:57:58 crc kubenswrapper[4810]: E1003 08:57:58.989977 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9\": container with ID starting with d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9 not found: ID does not exist" containerID="d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.990028 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9"} err="failed to get container status \"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9\": rpc error: code = NotFound desc = could not find container \"d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9\": container with ID starting with d4c616a1891ec300db77f2ad1d7d35a87bdd9be5890a477a929d2b2a6ca2efe9 not found: ID does not exist" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.990066 4810 scope.go:117] "RemoveContainer" containerID="e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483" Oct 03 08:57:58 crc kubenswrapper[4810]: E1003 08:57:58.990430 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483\": container with ID starting with e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483 not found: ID does not exist" containerID="e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483" Oct 03 08:57:58 crc kubenswrapper[4810]: I1003 08:57:58.990463 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483"} err="failed to get container status \"e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483\": rpc error: code = NotFound desc = could not find container \"e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483\": container with ID starting with e4b1891696a4df3235fa9c2efe9879cd8cff98a7f72d8637693d830b77865483 not found: ID does not exist" Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.311579 4810 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" path="/var/lib/kubelet/pods/013a4732-db1c-4da7-9282-fa56fd2088a2/volumes" Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.312702 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768258c1-acd5-41a9-a813-2761974c32d5" path="/var/lib/kubelet/pods/768258c1-acd5-41a9-a813-2761974c32d5/volumes" Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.775087 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.823128 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.823412 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" podUID="c507139c-73f6-4921-979f-026491697652" containerName="dnsmasq-dns" containerID="cri-o://8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b" gracePeriod=10 Oct 03 08:57:59 crc kubenswrapper[4810]: I1003 08:57:59.912626 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerStarted","Data":"0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f"} Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.302945 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:58:00 crc kubenswrapper[4810]: E1003 08:58:00.303359 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.322478 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.452735 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb\") pod \"c507139c-73f6-4921-979f-026491697652\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.452865 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc\") pod \"c507139c-73f6-4921-979f-026491697652\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.452948 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config\") pod \"c507139c-73f6-4921-979f-026491697652\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.453044 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbvj\" (UniqueName: \"kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj\") pod \"c507139c-73f6-4921-979f-026491697652\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.453088 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb\") pod \"c507139c-73f6-4921-979f-026491697652\" (UID: \"c507139c-73f6-4921-979f-026491697652\") " Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.460324 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj" (OuterVolumeSpecName: "kube-api-access-4wbvj") pod "c507139c-73f6-4921-979f-026491697652" (UID: "c507139c-73f6-4921-979f-026491697652"). InnerVolumeSpecName "kube-api-access-4wbvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.499016 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c507139c-73f6-4921-979f-026491697652" (UID: "c507139c-73f6-4921-979f-026491697652"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.499062 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c507139c-73f6-4921-979f-026491697652" (UID: "c507139c-73f6-4921-979f-026491697652"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.501563 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config" (OuterVolumeSpecName: "config") pod "c507139c-73f6-4921-979f-026491697652" (UID: "c507139c-73f6-4921-979f-026491697652"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.513534 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c507139c-73f6-4921-979f-026491697652" (UID: "c507139c-73f6-4921-979f-026491697652"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.555693 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.555732 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.555742 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.555751 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbvj\" (UniqueName: \"kubernetes.io/projected/c507139c-73f6-4921-979f-026491697652-kube-api-access-4wbvj\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.555763 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c507139c-73f6-4921-979f-026491697652-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.927540 4810 generic.go:334] "Generic (PLEG): container finished" podID="c507139c-73f6-4921-979f-026491697652" containerID="8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b" exitCode=0 Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.927625 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" event={"ID":"c507139c-73f6-4921-979f-026491697652","Type":"ContainerDied","Data":"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b"} Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.927656 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" event={"ID":"c507139c-73f6-4921-979f-026491697652","Type":"ContainerDied","Data":"f1660f6e64101c26eb8505395ac83749f554408cbadf0826120c70f700ebbfbb"} Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.927672 4810 scope.go:117] "RemoveContainer" containerID="8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.927626 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fc544d64f-4wv6k" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.932733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerStarted","Data":"396726b91c59f0aac9fad0ff36616e1f70f3c5038a278618904f68c972d347a8"} Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.933261 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.963348 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.963330577 podStartE2EDuration="3.963330577s" podCreationTimestamp="2025-10-03 08:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:00.95104458 +0000 UTC m=+7314.378295315" watchObservedRunningTime="2025-10-03 08:58:00.963330577 +0000 UTC m=+7314.390581312" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.964876 4810 scope.go:117] "RemoveContainer" containerID="0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.975734 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.984293 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fc544d64f-4wv6k"] Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.984985 4810 scope.go:117] "RemoveContainer" containerID="8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b" Oct 03 08:58:00 crc kubenswrapper[4810]: E1003 08:58:00.985453 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b\": container with ID starting with 8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b not found: ID does not exist" containerID="8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.985498 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b"} err="failed to get container status \"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b\": rpc error: code = NotFound desc = could not find container \"8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b\": container with ID starting with 8beb01b2d3035658be24d0bdf863619ada72c7423f52bd2634d7204bb8b1e85b not found: ID does not exist" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.985523 4810 scope.go:117] "RemoveContainer" containerID="0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc" Oct 03 08:58:00 crc kubenswrapper[4810]: E1003 08:58:00.985790 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc\": container with ID starting with 0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc not found: ID does not exist" containerID="0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc" Oct 03 08:58:00 crc kubenswrapper[4810]: I1003 08:58:00.985814 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc"} err="failed to get container status \"0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc\": rpc error: code = NotFound desc = could not find container \"0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc\": container with ID starting with 0f06372802b93bdb6493d03811a091268e3c5c7db918af582d03e909747cb9cc not found: ID does not exist" Oct 03 08:58:01 crc kubenswrapper[4810]: I1003 08:58:01.312699 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c507139c-73f6-4921-979f-026491697652" path="/var/lib/kubelet/pods/c507139c-73f6-4921-979f-026491697652/volumes" Oct 03 08:58:10 crc kubenswrapper[4810]: I1003 08:58:10.126815 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 08:58:11 crc kubenswrapper[4810]: I1003 08:58:11.303076 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:58:11 crc kubenswrapper[4810]: E1003 08:58:11.303957 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:58:16 crc kubenswrapper[4810]: I1003 08:58:16.540549 4810 scope.go:117] "RemoveContainer" containerID="65c477036e034e9595a92ff85fdbb05324349f540a9eb6d7c19dabbb0a8f9653" Oct 03 08:58:26 crc kubenswrapper[4810]: I1003 08:58:26.302417 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:58:26 crc kubenswrapper[4810]: E1003 08:58:26.303446 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.124445 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:31 crc kubenswrapper[4810]: E1003 08:58:31.125451 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="extract-content" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125469 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="extract-content" Oct 03 08:58:31 crc kubenswrapper[4810]: E1003 08:58:31.125483 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="registry-server" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125491 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="registry-server" Oct 03 08:58:31 crc kubenswrapper[4810]: E1003 08:58:31.125529 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="extract-utilities" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125539 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="extract-utilities" Oct 03 08:58:31 crc kubenswrapper[4810]: E1003 08:58:31.125574 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c507139c-73f6-4921-979f-026491697652" containerName="dnsmasq-dns" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125583 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c507139c-73f6-4921-979f-026491697652" containerName="dnsmasq-dns" Oct 03 08:58:31 crc kubenswrapper[4810]: E1003 08:58:31.125592 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c507139c-73f6-4921-979f-026491697652" containerName="init" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125599 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c507139c-73f6-4921-979f-026491697652" containerName="init" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125813 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c507139c-73f6-4921-979f-026491697652" containerName="dnsmasq-dns" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.125846 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="013a4732-db1c-4da7-9282-fa56fd2088a2" containerName="registry-server" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.127007 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.130674 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.134119 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.258008 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.258088 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.258143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.258200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 
08:58:31.258320 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7gq\" (UniqueName: \"kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.258376 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360152 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360304 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7gq\" (UniqueName: \"kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360339 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360421 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360473 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.360608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.375388 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom\") 
pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.376503 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.376550 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.387498 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7gq\" (UniqueName: \"kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.388663 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts\") pod \"cinder-scheduler-0\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.453286 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:31 crc kubenswrapper[4810]: I1003 08:58:31.922508 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:32 crc kubenswrapper[4810]: I1003 08:58:32.216366 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerStarted","Data":"c7cfd157b8975bdd6899bd652dcf6cac49597aab16aec2a1f2f8d43e1a2f3fb4"} Oct 03 08:58:32 crc kubenswrapper[4810]: I1003 08:58:32.859145 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:32 crc kubenswrapper[4810]: I1003 08:58:32.859673 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api-log" containerID="cri-o://0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f" gracePeriod=30 Oct 03 08:58:32 crc kubenswrapper[4810]: I1003 08:58:32.859768 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api" containerID="cri-o://396726b91c59f0aac9fad0ff36616e1f70f3c5038a278618904f68c972d347a8" gracePeriod=30 Oct 03 08:58:33 crc kubenswrapper[4810]: E1003 08:58:33.164094 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b47792_9b35_4012_9c8b_cf6a23539bb2.slice/crio-conmon-0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60b47792_9b35_4012_9c8b_cf6a23539bb2.slice/crio-0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:58:33 crc kubenswrapper[4810]: I1003 08:58:33.245935 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerDied","Data":"0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f"} Oct 03 08:58:33 crc kubenswrapper[4810]: I1003 08:58:33.245774 4810 generic.go:334] "Generic (PLEG): container finished" podID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerID="0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f" exitCode=143 Oct 03 08:58:33 crc kubenswrapper[4810]: I1003 08:58:33.249467 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerStarted","Data":"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f"} Oct 03 08:58:34 crc kubenswrapper[4810]: I1003 08:58:34.258742 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerStarted","Data":"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59"} Oct 03 08:58:34 crc kubenswrapper[4810]: I1003 08:58:34.281274 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.978495684 podStartE2EDuration="3.281250822s" podCreationTimestamp="2025-10-03 08:58:31 +0000 UTC" firstStartedPulling="2025-10-03 08:58:31.928548418 +0000 UTC m=+7345.355799153" lastFinishedPulling="2025-10-03 08:58:32.231303556 +0000 UTC m=+7345.658554291" observedRunningTime="2025-10-03 08:58:34.273828664 +0000 UTC m=+7347.701079419" watchObservedRunningTime="2025-10-03 08:58:34.281250822 +0000 UTC m=+7347.708501567" Oct 03 08:58:35 crc kubenswrapper[4810]: I1003 08:58:35.996505 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.67:8776/healthcheck\": read tcp 10.217.0.2:43104->10.217.1.67:8776: read: connection reset by peer" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.281734 4810 generic.go:334] "Generic (PLEG): container finished" podID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerID="396726b91c59f0aac9fad0ff36616e1f70f3c5038a278618904f68c972d347a8" exitCode=0 Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.281777 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerDied","Data":"396726b91c59f0aac9fad0ff36616e1f70f3c5038a278618904f68c972d347a8"} Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.447678 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.453634 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552060 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552172 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552269 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crlxk\" (UniqueName: \"kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552329 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552391 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552423 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552477 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552546 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552594 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle\") pod \"60b47792-9b35-4012-9c8b-cf6a23539bb2\" (UID: \"60b47792-9b35-4012-9c8b-cf6a23539bb2\") " Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552778 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs" (OuterVolumeSpecName: "logs") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.552845 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.553282 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60b47792-9b35-4012-9c8b-cf6a23539bb2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.553306 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60b47792-9b35-4012-9c8b-cf6a23539bb2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.559750 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts" (OuterVolumeSpecName: "scripts") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.563105 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk" (OuterVolumeSpecName: "kube-api-access-crlxk") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "kube-api-access-crlxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.567536 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.602312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.639486 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.640483 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.646946 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data" (OuterVolumeSpecName: "config-data") pod "60b47792-9b35-4012-9c8b-cf6a23539bb2" (UID: "60b47792-9b35-4012-9c8b-cf6a23539bb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655128 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655157 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655169 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655179 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655188 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655196 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60b47792-9b35-4012-9c8b-cf6a23539bb2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:36 crc kubenswrapper[4810]: I1003 08:58:36.655206 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crlxk\" (UniqueName: \"kubernetes.io/projected/60b47792-9b35-4012-9c8b-cf6a23539bb2-kube-api-access-crlxk\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.294362 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"60b47792-9b35-4012-9c8b-cf6a23539bb2","Type":"ContainerDied","Data":"548628f4fb93f96b9f4e39814c7de7303de436a7db01c2c1efa11a283ba1eb61"} Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.294414 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.294571 4810 scope.go:117] "RemoveContainer" containerID="396726b91c59f0aac9fad0ff36616e1f70f3c5038a278618904f68c972d347a8" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.309790 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:58:37 crc kubenswrapper[4810]: E1003 08:58:37.310153 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.355346 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.367845 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.396485 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.398114 4810 scope.go:117] "RemoveContainer" containerID="0e2289e4f54b2cfcda193967b88c3de227c909659278751d2b3eadc554bed33f" Oct 03 08:58:37 crc kubenswrapper[4810]: E1003 08:58:37.400129 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.400169 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api" Oct 03 08:58:37 crc kubenswrapper[4810]: E1003 08:58:37.400204 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api-log" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.400214 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api-log" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.400700 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.400739 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" containerName="cinder-api-log" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.406956 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.415132 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.416956 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.417231 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.417380 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575085 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575155 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcfvc\" (UniqueName: \"kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575231 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575257 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575283 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575357 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575392 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575417 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.575441 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.676998 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcfvc\" (UniqueName: \"kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677151 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677176 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677250 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677280 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677306 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677330 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.677393 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.678309 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.678466 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.682773 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.683390 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.689385 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.689427 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.690350 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.691413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.703687 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcfvc\" (UniqueName: \"kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc\") pod \"cinder-api-0\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " pod="openstack/cinder-api-0" Oct 03 08:58:37 crc kubenswrapper[4810]: I1003 08:58:37.730166 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:58:38 crc kubenswrapper[4810]: I1003 08:58:38.355048 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:58:39 crc kubenswrapper[4810]: I1003 08:58:39.332335 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b47792-9b35-4012-9c8b-cf6a23539bb2" path="/var/lib/kubelet/pods/60b47792-9b35-4012-9c8b-cf6a23539bb2/volumes" Oct 03 08:58:39 crc kubenswrapper[4810]: I1003 08:58:39.349543 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerStarted","Data":"efbbea635fd7c2512387b085e21c476168a7ee9b5fafe68ca05dc78b78d49a5c"} Oct 03 08:58:39 crc kubenswrapper[4810]: I1003 08:58:39.349591 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerStarted","Data":"6d50228b624939992d60712869eaa0248b1a78dbd9158d13f41f238c53adc4c4"} Oct 03 08:58:40 crc kubenswrapper[4810]: I1003 08:58:40.363011 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerStarted","Data":"be1d419bdcaddfa86e01ac0347a626d35490d6b41d2b7976a512edfe5809af3c"} Oct 03 08:58:40 crc kubenswrapper[4810]: I1003 08:58:40.364790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:58:40 crc kubenswrapper[4810]: I1003 08:58:40.395726 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.395704755 podStartE2EDuration="3.395704755s" podCreationTimestamp="2025-10-03 08:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:40.385155283 +0000 UTC m=+7353.812406018" watchObservedRunningTime="2025-10-03 08:58:40.395704755 +0000 UTC m=+7353.822955500" Oct 03 08:58:41 crc kubenswrapper[4810]: I1003 08:58:41.646770 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 08:58:41 crc kubenswrapper[4810]: I1003 08:58:41.705089 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:42 crc kubenswrapper[4810]: I1003 08:58:42.381163 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="cinder-scheduler" containerID="cri-o://fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f" gracePeriod=30 Oct 03 08:58:42 crc kubenswrapper[4810]: I1003 08:58:42.381323 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="probe" containerID="cri-o://59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59" gracePeriod=30 Oct 03 08:58:43 crc kubenswrapper[4810]: I1003 08:58:43.396583 4810 generic.go:334] "Generic (PLEG): container finished" podID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerID="59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59" exitCode=0 Oct 03 08:58:43 crc kubenswrapper[4810]: I1003 08:58:43.396644 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerDied","Data":"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59"} Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.087839 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212198 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212273 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn7gq\" (UniqueName: \"kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212369 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212547 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.212574 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom\") pod \"e5801380-6f95-4fb7-ad68-e517b7d5b277\" (UID: \"e5801380-6f95-4fb7-ad68-e517b7d5b277\") " Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.214101 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.219956 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts" (OuterVolumeSpecName: "scripts") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.220975 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.221162 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq" (OuterVolumeSpecName: "kube-api-access-kn7gq") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "kube-api-access-kn7gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.271387 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.314631 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5801380-6f95-4fb7-ad68-e517b7d5b277-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.314665 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn7gq\" (UniqueName: \"kubernetes.io/projected/e5801380-6f95-4fb7-ad68-e517b7d5b277-kube-api-access-kn7gq\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.314680 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.314691 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.314703 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.338162 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data" (OuterVolumeSpecName: "config-data") pod "e5801380-6f95-4fb7-ad68-e517b7d5b277" (UID: "e5801380-6f95-4fb7-ad68-e517b7d5b277"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.410888 4810 generic.go:334] "Generic (PLEG): container finished" podID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerID="fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f" exitCode=0 Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.410929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerDied","Data":"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f"} Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.410947 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.410997 4810 scope.go:117] "RemoveContainer" containerID="59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.410982 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e5801380-6f95-4fb7-ad68-e517b7d5b277","Type":"ContainerDied","Data":"c7cfd157b8975bdd6899bd652dcf6cac49597aab16aec2a1f2f8d43e1a2f3fb4"} Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.416137 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5801380-6f95-4fb7-ad68-e517b7d5b277-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.460764 4810 scope.go:117] "RemoveContainer" containerID="fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.483949 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.496726 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.500116 4810 scope.go:117] "RemoveContainer" containerID="59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59" Oct 03 08:58:44 crc kubenswrapper[4810]: E1003 08:58:44.504070 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59\": container with ID starting with 59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59 not found: ID does not exist" containerID="59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.504117 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59"} err="failed to get container status \"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59\": rpc error: code = NotFound desc = could not find container \"59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59\": container with ID starting with 59460937626d04e964ff46b41c0ec35edeff1bf3fb369452079b64da9f55cf59 not found: ID does not exist" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.504144 4810 scope.go:117] "RemoveContainer" containerID="fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f" Oct 03 08:58:44 crc kubenswrapper[4810]: E1003 08:58:44.507500 4810 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f\": container with ID starting with fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f not found: ID does not exist" containerID="fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.507544 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f"} err="failed to get container status \"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f\": rpc error: code = NotFound desc = could not find container \"fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f\": container with ID starting with fd0619dc0e97d8544699ea7d5f6164a297bc607a84c75198aca4f2c58573b98f not found: ID does not exist" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.524641 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:44 crc kubenswrapper[4810]: E1003 08:58:44.525149 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="cinder-scheduler" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.525167 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="cinder-scheduler" Oct 03 08:58:44 crc kubenswrapper[4810]: E1003 08:58:44.525190 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="probe" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.525196 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="probe" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.525357 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="cinder-scheduler" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.525391 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" containerName="probe" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.526451 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.534264 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.536720 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620177 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620588 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620687 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620713 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620736 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.620768 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk28d\" (UniqueName: \"kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723136 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723217 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723247 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723288 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk28d\" (UniqueName: \"kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723330 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723575 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.723625 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.728559 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.728847 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.729333 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.732864 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.741501 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk28d\" (UniqueName: \"kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d\") pod \"cinder-scheduler-0\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " pod="openstack/cinder-scheduler-0" 
Oct 03 08:58:44 crc kubenswrapper[4810]: I1003 08:58:44.902830 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:58:45 crc kubenswrapper[4810]: I1003 08:58:45.312991 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5801380-6f95-4fb7-ad68-e517b7d5b277" path="/var/lib/kubelet/pods/e5801380-6f95-4fb7-ad68-e517b7d5b277/volumes" Oct 03 08:58:45 crc kubenswrapper[4810]: I1003 08:58:45.373635 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:58:45 crc kubenswrapper[4810]: I1003 08:58:45.421064 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerStarted","Data":"0026cbfd4bc90a0bd014e4886b14e2539b70a78feaada6ac4f06b620a695d53a"} Oct 03 08:58:46 crc kubenswrapper[4810]: I1003 08:58:46.429930 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerStarted","Data":"60d21af689f98576a4bb81b1221ab2ccefd04ca65804bc0605a1a62f203b010e"} Oct 03 08:58:46 crc kubenswrapper[4810]: I1003 08:58:46.430198 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerStarted","Data":"592213b9cf15535204f9f07e1afe93eab58cb7c7e9c4720ea154fc4abf864d38"} Oct 03 08:58:46 crc kubenswrapper[4810]: I1003 08:58:46.461375 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.461350504 podStartE2EDuration="2.461350504s" podCreationTimestamp="2025-10-03 08:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:46.453394161 +0000 UTC m=+7359.880644906" watchObservedRunningTime="2025-10-03 08:58:46.461350504 +0000 UTC m=+7359.888601239" Oct 03 08:58:49 crc kubenswrapper[4810]: I1003 08:58:49.707124 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 08:58:49 crc kubenswrapper[4810]: I1003 08:58:49.903508 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:58:51 crc kubenswrapper[4810]: I1003 08:58:51.311251 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:58:51 crc kubenswrapper[4810]: E1003 08:58:51.311850 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 08:58:55 crc kubenswrapper[4810]: I1003 08:58:55.123705 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.742308 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-mrv6g"] Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.743976 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mrv6g" Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.750371 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mrv6g"] Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.800037 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl2qj\" (UniqueName: \"kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj\") pod \"glance-db-create-mrv6g\" (UID: \"161e5d26-127c-4659-a4a8-8d44c19f4c6d\") " pod="openstack/glance-db-create-mrv6g" Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.901929 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl2qj\" (UniqueName: \"kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj\") pod \"glance-db-create-mrv6g\" (UID: \"161e5d26-127c-4659-a4a8-8d44c19f4c6d\") " pod="openstack/glance-db-create-mrv6g" Oct 03 08:58:58 crc kubenswrapper[4810]: I1003 08:58:58.926525 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl2qj\" (UniqueName: \"kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj\") pod \"glance-db-create-mrv6g\" (UID: \"161e5d26-127c-4659-a4a8-8d44c19f4c6d\") " pod="openstack/glance-db-create-mrv6g" Oct 03 08:58:59 crc kubenswrapper[4810]: I1003 08:58:59.067208 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mrv6g" Oct 03 08:58:59 crc kubenswrapper[4810]: I1003 08:58:59.530550 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-mrv6g"] Oct 03 08:58:59 crc kubenswrapper[4810]: W1003 08:58:59.547246 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod161e5d26_127c_4659_a4a8_8d44c19f4c6d.slice/crio-61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844 WatchSource:0}: Error finding container 61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844: Status 404 returned error can't find the container with id 61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844 Oct 03 08:59:00 crc kubenswrapper[4810]: I1003 08:59:00.564074 4810 generic.go:334] "Generic (PLEG): container finished" podID="161e5d26-127c-4659-a4a8-8d44c19f4c6d" containerID="2d8466c62ce7e0b9940f0c67d89c783acd7ad0cbc026a4854d06e731b5f4d488" exitCode=0 Oct 03 08:59:00 crc kubenswrapper[4810]: I1003 08:59:00.564168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mrv6g" event={"ID":"161e5d26-127c-4659-a4a8-8d44c19f4c6d","Type":"ContainerDied","Data":"2d8466c62ce7e0b9940f0c67d89c783acd7ad0cbc026a4854d06e731b5f4d488"} Oct 03 08:59:00 crc kubenswrapper[4810]: I1003 08:59:00.565643 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mrv6g" event={"ID":"161e5d26-127c-4659-a4a8-8d44c19f4c6d","Type":"ContainerStarted","Data":"61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844"} Oct 03 08:59:01 crc kubenswrapper[4810]: I1003 08:59:01.923959 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-mrv6g" Oct 03 08:59:01 crc kubenswrapper[4810]: I1003 08:59:01.956813 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl2qj\" (UniqueName: \"kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj\") pod \"161e5d26-127c-4659-a4a8-8d44c19f4c6d\" (UID: \"161e5d26-127c-4659-a4a8-8d44c19f4c6d\") " Oct 03 08:59:01 crc kubenswrapper[4810]: I1003 08:59:01.965401 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj" (OuterVolumeSpecName: "kube-api-access-vl2qj") pod "161e5d26-127c-4659-a4a8-8d44c19f4c6d" (UID: "161e5d26-127c-4659-a4a8-8d44c19f4c6d"). InnerVolumeSpecName "kube-api-access-vl2qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:02 crc kubenswrapper[4810]: I1003 08:59:02.060041 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl2qj\" (UniqueName: \"kubernetes.io/projected/161e5d26-127c-4659-a4a8-8d44c19f4c6d-kube-api-access-vl2qj\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:02 crc kubenswrapper[4810]: I1003 08:59:02.584763 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-mrv6g" event={"ID":"161e5d26-127c-4659-a4a8-8d44c19f4c6d","Type":"ContainerDied","Data":"61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844"} Oct 03 08:59:02 crc kubenswrapper[4810]: I1003 08:59:02.585126 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61aab21de1f67e7776a40e313f102b515eed9cf098b74328c709818fa40f9844" Oct 03 08:59:02 crc kubenswrapper[4810]: I1003 08:59:02.584811 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-mrv6g" Oct 03 08:59:04 crc kubenswrapper[4810]: I1003 08:59:04.302647 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 08:59:05 crc kubenswrapper[4810]: I1003 08:59:05.619489 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973"} Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.871918 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-35f0-account-create-7x5x8"] Oct 03 08:59:08 crc kubenswrapper[4810]: E1003 08:59:08.873181 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161e5d26-127c-4659-a4a8-8d44c19f4c6d" containerName="mariadb-database-create" Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.873205 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="161e5d26-127c-4659-a4a8-8d44c19f4c6d" containerName="mariadb-database-create" Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.873542 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="161e5d26-127c-4659-a4a8-8d44c19f4c6d" containerName="mariadb-database-create" Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.874605 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.877052 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.883533 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35f0-account-create-7x5x8"] Oct 03 08:59:08 crc kubenswrapper[4810]: I1003 08:59:08.995744 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h262s\" (UniqueName: \"kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s\") pod \"glance-35f0-account-create-7x5x8\" (UID: \"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932\") " pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:09 crc kubenswrapper[4810]: I1003 08:59:09.098301 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h262s\" (UniqueName: \"kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s\") pod \"glance-35f0-account-create-7x5x8\" (UID: \"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932\") " pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:09 crc kubenswrapper[4810]: I1003 08:59:09.120738 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h262s\" (UniqueName: \"kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s\") pod \"glance-35f0-account-create-7x5x8\" (UID: \"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932\") " pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:09 crc kubenswrapper[4810]: I1003 08:59:09.203209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:09 crc kubenswrapper[4810]: I1003 08:59:09.730573 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35f0-account-create-7x5x8"] Oct 03 08:59:09 crc kubenswrapper[4810]: W1003 08:59:09.737773 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b8af1d9_1a3d_40e2_b33a_7fa4eac7f932.slice/crio-29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49 WatchSource:0}: Error finding container 29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49: Status 404 returned error can't find the container with id 29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49 Oct 03 08:59:10 crc kubenswrapper[4810]: I1003 08:59:10.668280 4810 generic.go:334] "Generic (PLEG): container finished" podID="9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" containerID="a83b74c1210a375c8424e845c6ab6a049c9e6f21598bca2ba113f0fff9f45b9d" exitCode=0 Oct 03 08:59:10 crc kubenswrapper[4810]: I1003 08:59:10.668359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35f0-account-create-7x5x8" event={"ID":"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932","Type":"ContainerDied","Data":"a83b74c1210a375c8424e845c6ab6a049c9e6f21598bca2ba113f0fff9f45b9d"} Oct 03 08:59:10 crc kubenswrapper[4810]: I1003 08:59:10.668967 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35f0-account-create-7x5x8" event={"ID":"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932","Type":"ContainerStarted","Data":"29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49"} Oct 03 08:59:11 crc kubenswrapper[4810]: I1003 08:59:11.980937 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.054618 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h262s\" (UniqueName: \"kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s\") pod \"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932\" (UID: \"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932\") " Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.059873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s" (OuterVolumeSpecName: "kube-api-access-h262s") pod "9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" (UID: "9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932"). InnerVolumeSpecName "kube-api-access-h262s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.156420 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h262s\" (UniqueName: \"kubernetes.io/projected/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932-kube-api-access-h262s\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.688535 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35f0-account-create-7x5x8" event={"ID":"9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932","Type":"ContainerDied","Data":"29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49"} Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.688868 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29849c2668e83ee030e4d6825efa79cc7d5667aa556b17c5efa4f357c36eae49" Oct 03 08:59:12 crc kubenswrapper[4810]: I1003 08:59:12.688639 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-35f0-account-create-7x5x8" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.926275 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-c4r7k"] Oct 03 08:59:13 crc kubenswrapper[4810]: E1003 08:59:13.926730 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" containerName="mariadb-account-create" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.926743 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" containerName="mariadb-account-create" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.926948 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" containerName="mariadb-account-create" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.927591 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.929433 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.931125 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4b2nh" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.958028 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c4r7k"] Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.991777 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssph4\" (UniqueName: \"kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.991865 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.991998 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:13 crc kubenswrapper[4810]: I1003 08:59:13.992037 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.093151 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssph4\" (UniqueName: \"kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.093232 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.093332 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.093370 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle\") pod 
\"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.099681 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.106161 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.109777 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.111249 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssph4\" (UniqueName: \"kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4\") pod \"glance-db-sync-c4r7k\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.258261 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:14 crc kubenswrapper[4810]: I1003 08:59:14.827766 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-c4r7k"] Oct 03 08:59:14 crc kubenswrapper[4810]: W1003 08:59:14.828372 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3091bacd_d5fb_4b1a_be28_1d3ae31ff392.slice/crio-119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf WatchSource:0}: Error finding container 119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf: Status 404 returned error can't find the container with id 119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf Oct 03 08:59:15 crc kubenswrapper[4810]: I1003 08:59:15.714595 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c4r7k" event={"ID":"3091bacd-d5fb-4b1a-be28-1d3ae31ff392","Type":"ContainerStarted","Data":"119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf"} Oct 03 08:59:31 crc kubenswrapper[4810]: I1003 08:59:31.866647 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c4r7k" event={"ID":"3091bacd-d5fb-4b1a-be28-1d3ae31ff392","Type":"ContainerStarted","Data":"1b9be289c09724a6a864c0593412c4c8e33195aca6ca4aaa5b91e450f9ef7bb2"} Oct 03 08:59:31 crc kubenswrapper[4810]: I1003 08:59:31.890185 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-c4r7k" podStartSLOduration=3.332153446 podStartE2EDuration="18.890166129s" podCreationTimestamp="2025-10-03 08:59:13 +0000 UTC" firstStartedPulling="2025-10-03 08:59:14.831175026 +0000 UTC m=+7388.258425771" lastFinishedPulling="2025-10-03 08:59:30.389187719 +0000 UTC m=+7403.816438454" 
observedRunningTime="2025-10-03 08:59:31.886394829 +0000 UTC m=+7405.313645574" watchObservedRunningTime="2025-10-03 08:59:31.890166129 +0000 UTC m=+7405.317416864" Oct 03 08:59:34 crc kubenswrapper[4810]: I1003 08:59:34.893552 4810 generic.go:334] "Generic (PLEG): container finished" podID="3091bacd-d5fb-4b1a-be28-1d3ae31ff392" containerID="1b9be289c09724a6a864c0593412c4c8e33195aca6ca4aaa5b91e450f9ef7bb2" exitCode=0 Oct 03 08:59:34 crc kubenswrapper[4810]: I1003 08:59:34.893657 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c4r7k" event={"ID":"3091bacd-d5fb-4b1a-be28-1d3ae31ff392","Type":"ContainerDied","Data":"1b9be289c09724a6a864c0593412c4c8e33195aca6ca4aaa5b91e450f9ef7bb2"} Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.278407 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.438375 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data\") pod \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.438518 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data\") pod \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.438545 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle\") pod \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.438617 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssph4\" (UniqueName: \"kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4\") pod \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\" (UID: \"3091bacd-d5fb-4b1a-be28-1d3ae31ff392\") " Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.445292 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3091bacd-d5fb-4b1a-be28-1d3ae31ff392" (UID: "3091bacd-d5fb-4b1a-be28-1d3ae31ff392"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.445714 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4" (OuterVolumeSpecName: "kube-api-access-ssph4") pod "3091bacd-d5fb-4b1a-be28-1d3ae31ff392" (UID: "3091bacd-d5fb-4b1a-be28-1d3ae31ff392"). InnerVolumeSpecName "kube-api-access-ssph4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.468826 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3091bacd-d5fb-4b1a-be28-1d3ae31ff392" (UID: "3091bacd-d5fb-4b1a-be28-1d3ae31ff392"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.491227 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data" (OuterVolumeSpecName: "config-data") pod "3091bacd-d5fb-4b1a-be28-1d3ae31ff392" (UID: "3091bacd-d5fb-4b1a-be28-1d3ae31ff392"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.540295 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.540339 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.540358 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssph4\" (UniqueName: \"kubernetes.io/projected/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-kube-api-access-ssph4\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.540372 4810 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3091bacd-d5fb-4b1a-be28-1d3ae31ff392-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.911652 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-c4r7k" event={"ID":"3091bacd-d5fb-4b1a-be28-1d3ae31ff392","Type":"ContainerDied","Data":"119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf"} Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.912233 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119ac676cdab4d5a5501a75b2ec62f08968066e37bcd8d66b688fee1c0b1d6bf" Oct 03 08:59:36 crc kubenswrapper[4810]: I1003 08:59:36.911727 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-c4r7k" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.214638 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:37 crc kubenswrapper[4810]: E1003 08:59:37.215096 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3091bacd-d5fb-4b1a-be28-1d3ae31ff392" containerName="glance-db-sync" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.215120 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3091bacd-d5fb-4b1a-be28-1d3ae31ff392" containerName="glance-db-sync" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.215353 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3091bacd-d5fb-4b1a-be28-1d3ae31ff392" containerName="glance-db-sync" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.216486 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.225043 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.227406 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.227467 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4b2nh" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.237960 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.347350 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.352457 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354275 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354323 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354387 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354455 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.354475 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww228\" (UniqueName: \"kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.361637 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.437288 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.441008 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.449884 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.457610 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463331 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463383 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdhv\" (UniqueName: \"kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463407 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p46nh\" (UniqueName: \"kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463429 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.463548 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.464544 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " 
pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.465854 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.465994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466154 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466273 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466424 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466490 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466642 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466689 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466844 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs\") pod \"glance-default-external-api-0\" (UID: 
\"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.466948 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww228\" (UniqueName: \"kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.468066 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.475533 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.475723 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.478853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.509885 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww228\" (UniqueName: \"kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228\") pod \"glance-default-external-api-0\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.541511 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569329 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569414 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569453 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569480 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569515 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569534 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdhv\" (UniqueName: \"kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc 
kubenswrapper[4810]: I1003 08:59:37.569629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p46nh\" (UniqueName: \"kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.569648 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.570148 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.570531 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.571069 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.571098 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.571456 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.572017 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.576787 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.577548 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.582437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.591039 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p46nh\" (UniqueName: \"kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh\") pod \"dnsmasq-dns-59876f5b47-67882\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.591920 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdhv\" (UniqueName: \"kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv\") pod \"glance-default-internal-api-0\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.690551 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:37 crc kubenswrapper[4810]: I1003 08:59:37.765304 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.006948 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:38 crc kubenswrapper[4810]: W1003 08:59:38.012972 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a57192f_040f_44a5_a9b7_35e59ea09eed.slice/crio-94ff4c0ea4e489578e22a97af203b695ff9126d3f1931c1bf274558b12488222 WatchSource:0}: Error finding container 94ff4c0ea4e489578e22a97af203b695ff9126d3f1931c1bf274558b12488222: Status 404 returned error can't find the container with id 94ff4c0ea4e489578e22a97af203b695ff9126d3f1931c1bf274558b12488222 Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.300396 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 08:59:38 crc kubenswrapper[4810]: W1003 08:59:38.310001 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92e89a61_317b_4bf4_b312_e9840f68c983.slice/crio-397c6616edfd1055f0cd0927ec8f3e9623b29d76a12d638b93d4479f25e7d6b4 WatchSource:0}: Error finding container 397c6616edfd1055f0cd0927ec8f3e9623b29d76a12d638b93d4479f25e7d6b4: Status 404 returned error can't find the container with id 397c6616edfd1055f0cd0927ec8f3e9623b29d76a12d638b93d4479f25e7d6b4 Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.460127 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.497708 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:38 crc kubenswrapper[4810]: W1003 08:59:38.508312 4810 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45922975_3112_4745_a96d_bdbbee34bfd4.slice/crio-63522ad16ff3cba677fc9e41e6e7c81ad22f72b8b288099423e4b0fe3936965f WatchSource:0}: Error finding container 63522ad16ff3cba677fc9e41e6e7c81ad22f72b8b288099423e4b0fe3936965f: Status 404 returned error can't find the container with id 63522ad16ff3cba677fc9e41e6e7c81ad22f72b8b288099423e4b0fe3936965f Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.955244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerStarted","Data":"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231"} Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.955695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerStarted","Data":"94ff4c0ea4e489578e22a97af203b695ff9126d3f1931c1bf274558b12488222"} Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.960275 4810 generic.go:334] "Generic (PLEG): container finished" podID="92e89a61-317b-4bf4-b312-e9840f68c983" containerID="29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7" exitCode=0 Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.961418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59876f5b47-67882" event={"ID":"92e89a61-317b-4bf4-b312-e9840f68c983","Type":"ContainerDied","Data":"29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7"} Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.961454 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59876f5b47-67882" event={"ID":"92e89a61-317b-4bf4-b312-e9840f68c983","Type":"ContainerStarted","Data":"397c6616edfd1055f0cd0927ec8f3e9623b29d76a12d638b93d4479f25e7d6b4"} Oct 03 08:59:38 crc kubenswrapper[4810]: I1003 08:59:38.973121 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerStarted","Data":"63522ad16ff3cba677fc9e41e6e7c81ad22f72b8b288099423e4b0fe3936965f"} Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.849262 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.983762 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59876f5b47-67882" event={"ID":"92e89a61-317b-4bf4-b312-e9840f68c983","Type":"ContainerStarted","Data":"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2"} Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.983814 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.985764 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerStarted","Data":"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32"} Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.985786 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerStarted","Data":"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b"} Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.988589 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerStarted","Data":"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0"} Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.988750 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-log" containerID="cri-o://ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" gracePeriod=30 Oct 03 08:59:39 crc kubenswrapper[4810]: I1003 08:59:39.988773 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-httpd" containerID="cri-o://1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" gracePeriod=30 Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.013376 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59876f5b47-67882" podStartSLOduration=3.013350666 podStartE2EDuration="3.013350666s" podCreationTimestamp="2025-10-03 08:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:40.00525975 +0000 UTC m=+7413.432510485" watchObservedRunningTime="2025-10-03 08:59:40.013350666 +0000 UTC m=+7413.440601421" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.026511 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.026484037 podStartE2EDuration="3.026484037s" podCreationTimestamp="2025-10-03 08:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:40.021480134 +0000 UTC m=+7413.448730869" watchObservedRunningTime="2025-10-03 08:59:40.026484037 +0000 UTC m=+7413.453734772" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.053136 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.053118939 podStartE2EDuration="3.053118939s" podCreationTimestamp="2025-10-03 08:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:40.046113182 +0000 UTC m=+7413.473363917" watchObservedRunningTime="2025-10-03 08:59:40.053118939 +0000 UTC m=+7413.480369674" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.679445 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.836442 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.836815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.836867 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.836954 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.837005 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.837054 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww228\" (UniqueName: \"kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228\") pod \"8a57192f-040f-44a5-a9b7-35e59ea09eed\" (UID: \"8a57192f-040f-44a5-a9b7-35e59ea09eed\") " Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.837309 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs" (OuterVolumeSpecName: "logs") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.837757 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.838098 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.838128 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a57192f-040f-44a5-a9b7-35e59ea09eed-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.846139 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts" (OuterVolumeSpecName: "scripts") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.856258 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228" (OuterVolumeSpecName: "kube-api-access-ww228") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "kube-api-access-ww228". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.878142 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.904911 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data" (OuterVolumeSpecName: "config-data") pod "8a57192f-040f-44a5-a9b7-35e59ea09eed" (UID: "8a57192f-040f-44a5-a9b7-35e59ea09eed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.941841 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.941876 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.941890 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a57192f-040f-44a5-a9b7-35e59ea09eed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:40 crc kubenswrapper[4810]: I1003 08:59:40.941915 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww228\" (UniqueName: \"kubernetes.io/projected/8a57192f-040f-44a5-a9b7-35e59ea09eed-kube-api-access-ww228\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.000982 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerID="1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" exitCode=0 Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.001020 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerID="ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" exitCode=143 Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.001228 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-log" containerID="cri-o://fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" gracePeriod=30 Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.001510 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.004006 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerDied","Data":"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0"} Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.004059 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerDied","Data":"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231"} Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.004076 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a57192f-040f-44a5-a9b7-35e59ea09eed","Type":"ContainerDied","Data":"94ff4c0ea4e489578e22a97af203b695ff9126d3f1931c1bf274558b12488222"} Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.004100 4810 scope.go:117] "RemoveContainer" containerID="1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.004123 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-httpd" containerID="cri-o://71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" gracePeriod=30 Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.039612 4810 scope.go:117] "RemoveContainer" containerID="ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.059942 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.066945 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.088777 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:41 crc kubenswrapper[4810]: E1003 08:59:41.089454 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-httpd" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.089480 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-httpd" Oct 03 08:59:41 crc kubenswrapper[4810]: E1003 08:59:41.089527 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-log" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.089535 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-log" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.089796 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-log" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.089829 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" containerName="glance-httpd" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.091334 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.094431 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.094555 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.097078 4810 scope.go:117] "RemoveContainer" containerID="1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" Oct 03 08:59:41 crc kubenswrapper[4810]: E1003 08:59:41.097569 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0\": container with ID starting with 1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0 not found: ID does not exist" containerID="1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.097609 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0"} err="failed to get container status \"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0\": rpc error: code = NotFound desc = could not find container \"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0\": container with ID starting with 1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0 not found: ID does not exist" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.097642 4810 scope.go:117] "RemoveContainer" containerID="ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" Oct 03 08:59:41 crc kubenswrapper[4810]: E1003 08:59:41.098257 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231\": container with ID starting with ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231 not found: ID does not exist" containerID="ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.098334 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231"} err="failed to get container status \"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231\": rpc error: code = NotFound desc = could not find container \"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231\": container with ID starting with ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231 not found: ID does not exist" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.098374 4810 scope.go:117] "RemoveContainer" containerID="1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.098889 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0"} err="failed to get container status \"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0\": rpc error: code = NotFound desc = could not find container \"1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0\": container 
with ID starting with 1ea0c58f64e5b6ada1f3748d4afcfe4be211764c342a4f48132009f5e01371d0 not found: ID does not exist" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.098941 4810 scope.go:117] "RemoveContainer" containerID="ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.099202 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231"} err="failed to get container status \"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231\": rpc error: code = NotFound desc = could not find container \"ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231\": container with ID starting with ac016f9e33dd169cd4d29e1f934fb3018a7e8affcc19bdfbcb8465ef8b9c9231 not found: ID does not exist" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.120869 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.148811 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149119 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149511 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149571 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149625 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.149786 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8cp\" (UniqueName: \"kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251634 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251698 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251797 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8cp\" (UniqueName: \"kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251875 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.251959 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.252006 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.252589 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.253042 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.256969 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.257101 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.257545 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.259312 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.268655 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8cp\" (UniqueName: \"kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp\") pod \"glance-default-external-api-0\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.315690 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a57192f-040f-44a5-a9b7-35e59ea09eed" path="/var/lib/kubelet/pods/8a57192f-040f-44a5-a9b7-35e59ea09eed/volumes" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.412740 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.687753 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.763539 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.763848 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.763987 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.764028 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.764060 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdhv\" (UniqueName: \"kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.764134 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run\") pod \"45922975-3112-4745-a96d-bdbbee34bfd4\" (UID: \"45922975-3112-4745-a96d-bdbbee34bfd4\") " Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.764864 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.765180 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs" (OuterVolumeSpecName: "logs") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.773795 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv" (OuterVolumeSpecName: "kube-api-access-mvdhv") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "kube-api-access-mvdhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.787963 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts" (OuterVolumeSpecName: "scripts") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.799323 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.817356 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data" (OuterVolumeSpecName: "config-data") pod "45922975-3112-4745-a96d-bdbbee34bfd4" (UID: "45922975-3112-4745-a96d-bdbbee34bfd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.864999 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.865033 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdhv\" (UniqueName: \"kubernetes.io/projected/45922975-3112-4745-a96d-bdbbee34bfd4-kube-api-access-mvdhv\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.865044 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.865055 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45922975-3112-4745-a96d-bdbbee34bfd4-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.865066 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.865076 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45922975-3112-4745-a96d-bdbbee34bfd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:41 crc kubenswrapper[4810]: I1003 08:59:41.976315 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:41 crc kubenswrapper[4810]: W1003 08:59:41.983120 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda116c862_90e2_44c5_834f_e273910f1297.slice/crio-bf2216e0769a6770980ba2269e212db32a76d17921283f2e2428ff394316c1bb WatchSource:0}: Error finding container bf2216e0769a6770980ba2269e212db32a76d17921283f2e2428ff394316c1bb: Status 404 returned error can't find the container with 
id bf2216e0769a6770980ba2269e212db32a76d17921283f2e2428ff394316c1bb Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010161 4810 generic.go:334] "Generic (PLEG): container finished" podID="45922975-3112-4745-a96d-bdbbee34bfd4" containerID="71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" exitCode=0 Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010205 4810 generic.go:334] "Generic (PLEG): container finished" podID="45922975-3112-4745-a96d-bdbbee34bfd4" containerID="fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" exitCode=143 Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010221 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010286 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerDied","Data":"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32"} Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010367 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerDied","Data":"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b"} Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"45922975-3112-4745-a96d-bdbbee34bfd4","Type":"ContainerDied","Data":"63522ad16ff3cba677fc9e41e6e7c81ad22f72b8b288099423e4b0fe3936965f"} Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.010402 4810 scope.go:117] "RemoveContainer" containerID="71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.012190 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerStarted","Data":"bf2216e0769a6770980ba2269e212db32a76d17921283f2e2428ff394316c1bb"} Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.044790 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.046522 4810 scope.go:117] "RemoveContainer" containerID="fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.055052 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.067540 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:42 crc kubenswrapper[4810]: E1003 08:59:42.068019 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-log" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.068037 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-log" Oct 03 08:59:42 crc kubenswrapper[4810]: E1003 08:59:42.068081 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-httpd" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.068088 4810 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-httpd" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.068289 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-httpd" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.068331 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" containerName="glance-log" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.069409 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.074967 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.075079 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.082188 4810 scope.go:117] "RemoveContainer" containerID="71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" Oct 03 08:59:42 crc kubenswrapper[4810]: E1003 08:59:42.082582 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32\": container with ID starting with 71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32 not found: ID does not exist" containerID="71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.082615 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32"} err="failed to get container status \"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32\": rpc error: code = NotFound desc = could not find container \"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32\": container with ID starting with 71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32 not found: ID does not exist" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.082637 4810 scope.go:117] "RemoveContainer" containerID="fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" Oct 03 08:59:42 crc kubenswrapper[4810]: E1003 08:59:42.082857 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b\": container with ID starting with fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b not found: ID does not exist" containerID="fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.082885 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b"} err="failed to get container status \"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b\": rpc error: code = NotFound desc = could not find container \"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b\": container with ID starting with fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b not found: ID does not exist" Oct 03 08:59:42 crc 
kubenswrapper[4810]: I1003 08:59:42.082909 4810 scope.go:117] "RemoveContainer" containerID="71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.083102 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32"} err="failed to get container status \"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32\": rpc error: code = NotFound desc = could not find container \"71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32\": container with ID starting with 71560fcbccfab87a41e117ce5d34d2b84de57df18dca38afe8d494d721b12a32 not found: ID does not exist" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.083124 4810 scope.go:117] "RemoveContainer" containerID="fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.085741 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b"} err="failed to get container status \"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b\": rpc error: code = NotFound desc = could not find container \"fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b\": container with ID starting with fa2f39ebc707b0f64984e6623ee1ebf3abac99aa7806e937cae406131a7d721b not found: ID does not exist" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.087418 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj27x\" (UniqueName: \"kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174185 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174291 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174398 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174463 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.174489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.275997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276101 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276138 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276167 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj27x\" (UniqueName: \"kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276194 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276213 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.276279 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.277143 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.277385 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.281072 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.282105 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.282293 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.283346 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.298484 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj27x\" (UniqueName: \"kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x\") pod \"glance-default-internal-api-0\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.387860 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:42 crc kubenswrapper[4810]: I1003 08:59:42.895747 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:43 crc kubenswrapper[4810]: I1003 08:59:43.041217 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerStarted","Data":"503d86e98c9406e6b5448ab44f0b4b03255e58a9694d523acb5adcf474471755"} Oct 03 08:59:43 crc kubenswrapper[4810]: I1003 08:59:43.045074 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerStarted","Data":"da6d36cfa7b7b637f4e73ae57423fd756614536020cb3a1b0901b43424f13fad"} Oct 03 08:59:43 crc kubenswrapper[4810]: I1003 08:59:43.319518 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45922975-3112-4745-a96d-bdbbee34bfd4" path="/var/lib/kubelet/pods/45922975-3112-4745-a96d-bdbbee34bfd4/volumes" Oct 03 08:59:44 crc kubenswrapper[4810]: I1003 08:59:44.064676 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerStarted","Data":"273e1df6ad2e134547d4a271c43908bec4a5fac4bedce1850f5c497cd1349951"} Oct 03 08:59:44 crc kubenswrapper[4810]: I1003 08:59:44.067576 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerStarted","Data":"627170b69762133e49cc01416448d4326d9f790d64b22049e882b367a2baf91c"} Oct 03 08:59:44 crc kubenswrapper[4810]: I1003 08:59:44.092491 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.092460753 podStartE2EDuration="3.092460753s" podCreationTimestamp="2025-10-03 08:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:44.086663448 +0000 UTC m=+7417.513914183" watchObservedRunningTime="2025-10-03 08:59:44.092460753 +0000 UTC m=+7417.519711488" Oct 03 08:59:45 crc kubenswrapper[4810]: I1003 08:59:45.080155 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerStarted","Data":"04289b71777a3d321c84f88730195ef7e5009fbba18c93816fd6e2925f8793df"} Oct 03 08:59:45 crc kubenswrapper[4810]: I1003 08:59:45.110988 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.110958033 podStartE2EDuration="3.110958033s" podCreationTimestamp="2025-10-03 08:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:45.098408998 +0000 UTC m=+7418.525659763" watchObservedRunningTime="2025-10-03 08:59:45.110958033 +0000 UTC m=+7418.538208788" Oct 03 08:59:47 crc kubenswrapper[4810]: I1003 08:59:47.693297 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 08:59:47 crc kubenswrapper[4810]: I1003 08:59:47.762980 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] 
Oct 03 08:59:47 crc kubenswrapper[4810]: I1003 08:59:47.763247 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="dnsmasq-dns" containerID="cri-o://81d66d489d309b6cbd778c624c7816650200623b0f4c2bf3a68ef11fdf934462" gracePeriod=10 Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.115980 4810 generic.go:334] "Generic (PLEG): container finished" podID="afb6bb02-2adf-4187-9461-282a461bfd23" containerID="81d66d489d309b6cbd778c624c7816650200623b0f4c2bf3a68ef11fdf934462" exitCode=0 Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.116042 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" event={"ID":"afb6bb02-2adf-4187-9461-282a461bfd23","Type":"ContainerDied","Data":"81d66d489d309b6cbd778c624c7816650200623b0f4c2bf3a68ef11fdf934462"} Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.245976 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.413985 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb\") pod \"afb6bb02-2adf-4187-9461-282a461bfd23\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.414208 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb\") pod \"afb6bb02-2adf-4187-9461-282a461bfd23\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.414257 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl8tp\" (UniqueName: \"kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp\") pod \"afb6bb02-2adf-4187-9461-282a461bfd23\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.414280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc\") pod \"afb6bb02-2adf-4187-9461-282a461bfd23\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.414330 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config\") pod \"afb6bb02-2adf-4187-9461-282a461bfd23\" (UID: \"afb6bb02-2adf-4187-9461-282a461bfd23\") " Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.420876 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp" (OuterVolumeSpecName: "kube-api-access-zl8tp") pod "afb6bb02-2adf-4187-9461-282a461bfd23" (UID: "afb6bb02-2adf-4187-9461-282a461bfd23"). InnerVolumeSpecName "kube-api-access-zl8tp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.473864 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config" (OuterVolumeSpecName: "config") pod "afb6bb02-2adf-4187-9461-282a461bfd23" (UID: "afb6bb02-2adf-4187-9461-282a461bfd23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.474047 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "afb6bb02-2adf-4187-9461-282a461bfd23" (UID: "afb6bb02-2adf-4187-9461-282a461bfd23"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.475460 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "afb6bb02-2adf-4187-9461-282a461bfd23" (UID: "afb6bb02-2adf-4187-9461-282a461bfd23"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.486383 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "afb6bb02-2adf-4187-9461-282a461bfd23" (UID: "afb6bb02-2adf-4187-9461-282a461bfd23"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.516863 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.516915 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl8tp\" (UniqueName: \"kubernetes.io/projected/afb6bb02-2adf-4187-9461-282a461bfd23-kube-api-access-zl8tp\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.516932 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.516942 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:48 crc kubenswrapper[4810]: I1003 08:59:48.516954 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afb6bb02-2adf-4187-9461-282a461bfd23-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.127755 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" event={"ID":"afb6bb02-2adf-4187-9461-282a461bfd23","Type":"ContainerDied","Data":"e499fc2e695078f457daa0046493268b670c7fb2dde7b7b361b452534058adc9"} Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.127824 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b5588555-t7ffc" Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.127833 4810 scope.go:117] "RemoveContainer" containerID="81d66d489d309b6cbd778c624c7816650200623b0f4c2bf3a68ef11fdf934462" Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.167983 4810 scope.go:117] "RemoveContainer" containerID="312bac1b8a387e387f2de5c3d4db9bb82681f68f4799b5e3e4809f40b7ed6d90" Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.169651 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.178588 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b5588555-t7ffc"] Oct 03 08:59:49 crc kubenswrapper[4810]: I1003 08:59:49.313790 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" path="/var/lib/kubelet/pods/afb6bb02-2adf-4187-9461-282a461bfd23/volumes" Oct 03 08:59:51 crc kubenswrapper[4810]: I1003 08:59:51.414738 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4810]: I1003 08:59:51.416152 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4810]: I1003 08:59:51.451669 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4810]: I1003 08:59:51.478306 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.162658 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.162761 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.388141 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.388228 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.430670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4810]: I1003 08:59:52.438982 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:53 crc kubenswrapper[4810]: I1003 08:59:53.174723 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:53 crc kubenswrapper[4810]: I1003 08:59:53.176367 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:54 crc kubenswrapper[4810]: I1003 08:59:54.364177 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:59:54 crc kubenswrapper[4810]: I1003 08:59:54.365668 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:59:54 crc 
kubenswrapper[4810]: I1003 08:59:54.543918 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:59:55 crc kubenswrapper[4810]: I1003 08:59:55.298452 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:55 crc kubenswrapper[4810]: I1003 08:59:55.299075 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:59:55 crc kubenswrapper[4810]: I1003 08:59:55.395026 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.151843 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59"] Oct 03 09:00:00 crc kubenswrapper[4810]: E1003 09:00:00.152974 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="init" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.152994 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="init" Oct 03 09:00:00 crc kubenswrapper[4810]: E1003 09:00:00.153018 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="dnsmasq-dns" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.153026 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="dnsmasq-dns" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.153234 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb6bb02-2adf-4187-9461-282a461bfd23" containerName="dnsmasq-dns" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.156962 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.171315 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.174478 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.176100 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59"] Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.196425 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5fg\" (UniqueName: \"kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.202026 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.202937 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.307150 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.307338 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl5fg\" (UniqueName: \"kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.307371 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.308995 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume\") pod 
\"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.322037 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.326651 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl5fg\" (UniqueName: \"kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg\") pod \"collect-profiles-29324700-2dd59\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:00 crc kubenswrapper[4810]: I1003 09:00:00.508336 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:00.999464 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59"] Oct 03 09:00:01 crc kubenswrapper[4810]: W1003 09:00:01.005201 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169fe661_e53e_4de6_a570_90e93d01bfa5.slice/crio-ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6 WatchSource:0}: Error finding container ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6: Status 404 returned error can't find the container with id ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6 Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.267758 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" event={"ID":"169fe661-e53e-4de6-a570-90e93d01bfa5","Type":"ContainerStarted","Data":"239b7dc7c6a2c32406017c542acde31c2f5cd1371462039c395771b820606a77"} Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.268133 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" event={"ID":"169fe661-e53e-4de6-a570-90e93d01bfa5","Type":"ContainerStarted","Data":"ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6"} Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.291581 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" podStartSLOduration=1.29154228 podStartE2EDuration="1.29154228s" podCreationTimestamp="2025-10-03 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:01.284407589 +0000 UTC m=+7434.711658344" watchObservedRunningTime="2025-10-03 09:00:01.29154228 +0000 UTC m=+7434.718793015" Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.843716 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lxxxj"] Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.845008 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.855391 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lxxxj"] Oct 03 09:00:01 crc kubenswrapper[4810]: I1003 09:00:01.942594 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlj7d\" (UniqueName: \"kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d\") pod \"placement-db-create-lxxxj\" (UID: \"d95e2e2a-9340-4300-ac33-887767f7e5cf\") " pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.045896 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlj7d\" (UniqueName: \"kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d\") pod \"placement-db-create-lxxxj\" (UID: \"d95e2e2a-9340-4300-ac33-887767f7e5cf\") " pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.069721 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlj7d\" (UniqueName: \"kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d\") pod \"placement-db-create-lxxxj\" (UID: \"d95e2e2a-9340-4300-ac33-887767f7e5cf\") " pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.162896 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.281060 4810 generic.go:334] "Generic (PLEG): container finished" podID="169fe661-e53e-4de6-a570-90e93d01bfa5" containerID="239b7dc7c6a2c32406017c542acde31c2f5cd1371462039c395771b820606a77" exitCode=0 Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.281103 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" event={"ID":"169fe661-e53e-4de6-a570-90e93d01bfa5","Type":"ContainerDied","Data":"239b7dc7c6a2c32406017c542acde31c2f5cd1371462039c395771b820606a77"} Oct 03 09:00:02 crc kubenswrapper[4810]: I1003 09:00:02.717419 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lxxxj"] Oct 03 09:00:02 crc kubenswrapper[4810]: W1003 09:00:02.722086 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd95e2e2a_9340_4300_ac33_887767f7e5cf.slice/crio-9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e WatchSource:0}: Error finding container 9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e: Status 404 returned error can't find the container with id 9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.298753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lxxxj" event={"ID":"d95e2e2a-9340-4300-ac33-887767f7e5cf","Type":"ContainerStarted","Data":"a1a04de7847d46fbcd88ab5dbf418d466bbf53cbbc2ab4300ab5479a4f000cdb"} Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.299362 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lxxxj" event={"ID":"d95e2e2a-9340-4300-ac33-887767f7e5cf","Type":"ContainerStarted","Data":"9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e"} Oct 03 09:00:03 crc 
kubenswrapper[4810]: I1003 09:00:03.338662 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-lxxxj" podStartSLOduration=2.33864123 podStartE2EDuration="2.33864123s" podCreationTimestamp="2025-10-03 09:00:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:03.323552667 +0000 UTC m=+7436.750803422" watchObservedRunningTime="2025-10-03 09:00:03.33864123 +0000 UTC m=+7436.765891965" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.682153 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.787596 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume\") pod \"169fe661-e53e-4de6-a570-90e93d01bfa5\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.788374 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume\") pod \"169fe661-e53e-4de6-a570-90e93d01bfa5\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.788511 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl5fg\" (UniqueName: \"kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg\") pod \"169fe661-e53e-4de6-a570-90e93d01bfa5\" (UID: \"169fe661-e53e-4de6-a570-90e93d01bfa5\") " Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.789274 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume" (OuterVolumeSpecName: "config-volume") pod "169fe661-e53e-4de6-a570-90e93d01bfa5" (UID: "169fe661-e53e-4de6-a570-90e93d01bfa5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.794696 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "169fe661-e53e-4de6-a570-90e93d01bfa5" (UID: "169fe661-e53e-4de6-a570-90e93d01bfa5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.794703 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg" (OuterVolumeSpecName: "kube-api-access-gl5fg") pod "169fe661-e53e-4de6-a570-90e93d01bfa5" (UID: "169fe661-e53e-4de6-a570-90e93d01bfa5"). InnerVolumeSpecName "kube-api-access-gl5fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.891560 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/169fe661-e53e-4de6-a570-90e93d01bfa5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.892446 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/169fe661-e53e-4de6-a570-90e93d01bfa5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4810]: I1003 09:00:03.892540 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl5fg\" (UniqueName: \"kubernetes.io/projected/169fe661-e53e-4de6-a570-90e93d01bfa5-kube-api-access-gl5fg\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.315980 4810 generic.go:334] "Generic (PLEG): container finished" podID="d95e2e2a-9340-4300-ac33-887767f7e5cf" containerID="a1a04de7847d46fbcd88ab5dbf418d466bbf53cbbc2ab4300ab5479a4f000cdb" exitCode=0 Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.316161 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lxxxj" event={"ID":"d95e2e2a-9340-4300-ac33-887767f7e5cf","Type":"ContainerDied","Data":"a1a04de7847d46fbcd88ab5dbf418d466bbf53cbbc2ab4300ab5479a4f000cdb"} Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.318816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" event={"ID":"169fe661-e53e-4de6-a570-90e93d01bfa5","Type":"ContainerDied","Data":"ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6"} Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.318871 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff021c5d989cd8c086bc966a16e6c47f797da29879b4318302d7a001c3cf56a6" Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.319044 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-2dd59" Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.386572 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn"] Oct 03 09:00:04 crc kubenswrapper[4810]: I1003 09:00:04.396640 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324655-mpsfn"] Oct 03 09:00:05 crc kubenswrapper[4810]: I1003 09:00:05.317298 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c14d633-6153-4c93-acaf-e8220d5e601b" path="/var/lib/kubelet/pods/9c14d633-6153-4c93-acaf-e8220d5e601b/volumes" Oct 03 09:00:05 crc kubenswrapper[4810]: I1003 09:00:05.707688 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:05 crc kubenswrapper[4810]: I1003 09:00:05.833837 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlj7d\" (UniqueName: \"kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d\") pod \"d95e2e2a-9340-4300-ac33-887767f7e5cf\" (UID: \"d95e2e2a-9340-4300-ac33-887767f7e5cf\") " Oct 03 09:00:05 crc kubenswrapper[4810]: I1003 09:00:05.839524 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d" (OuterVolumeSpecName: "kube-api-access-xlj7d") pod "d95e2e2a-9340-4300-ac33-887767f7e5cf" (UID: "d95e2e2a-9340-4300-ac33-887767f7e5cf"). InnerVolumeSpecName "kube-api-access-xlj7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:05 crc kubenswrapper[4810]: I1003 09:00:05.936913 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlj7d\" (UniqueName: \"kubernetes.io/projected/d95e2e2a-9340-4300-ac33-887767f7e5cf-kube-api-access-xlj7d\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:06 crc kubenswrapper[4810]: I1003 09:00:06.342861 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lxxxj" event={"ID":"d95e2e2a-9340-4300-ac33-887767f7e5cf","Type":"ContainerDied","Data":"9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e"} Oct 03 09:00:06 crc kubenswrapper[4810]: I1003 09:00:06.342937 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9989399f0ff593fa90134cb3929f9ea5bca25feb13b0c589df7638fc783ee44e" Oct 03 09:00:06 crc kubenswrapper[4810]: I1003 09:00:06.342946 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lxxxj" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.952458 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bfbb-account-create-q9qdc"] Oct 03 09:00:11 crc kubenswrapper[4810]: E1003 09:00:11.954804 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95e2e2a-9340-4300-ac33-887767f7e5cf" containerName="mariadb-database-create" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.955290 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95e2e2a-9340-4300-ac33-887767f7e5cf" containerName="mariadb-database-create" Oct 03 09:00:11 crc kubenswrapper[4810]: E1003 09:00:11.955421 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="169fe661-e53e-4de6-a570-90e93d01bfa5" containerName="collect-profiles" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.955520 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="169fe661-e53e-4de6-a570-90e93d01bfa5" containerName="collect-profiles" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.955933 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95e2e2a-9340-4300-ac33-887767f7e5cf" containerName="mariadb-database-create" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.956057 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="169fe661-e53e-4de6-a570-90e93d01bfa5" containerName="collect-profiles" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.956991 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.959630 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.965440 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bfbb-account-create-q9qdc"] Oct 03 09:00:11 crc kubenswrapper[4810]: I1003 09:00:11.991186 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x5md\" (UniqueName: \"kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md\") pod \"placement-bfbb-account-create-q9qdc\" (UID: \"65c121bd-7601-4845-8b51-c72706f78cbe\") " pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:12 crc kubenswrapper[4810]: I1003 09:00:12.096227 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x5md\" (UniqueName: \"kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md\") pod \"placement-bfbb-account-create-q9qdc\" (UID: \"65c121bd-7601-4845-8b51-c72706f78cbe\") " pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:12 crc kubenswrapper[4810]: I1003 09:00:12.129362 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x5md\" (UniqueName: \"kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md\") pod \"placement-bfbb-account-create-q9qdc\" (UID: \"65c121bd-7601-4845-8b51-c72706f78cbe\") " pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:12 crc kubenswrapper[4810]: I1003 09:00:12.276910 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:12 crc kubenswrapper[4810]: I1003 09:00:12.769794 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bfbb-account-create-q9qdc"] Oct 03 09:00:13 crc kubenswrapper[4810]: I1003 09:00:13.458625 4810 generic.go:334] "Generic (PLEG): container finished" podID="65c121bd-7601-4845-8b51-c72706f78cbe" containerID="0f21340868b9429edd2728b912b4f49b1e34b40140f6b795bd7c4f57f73d6f69" exitCode=0 Oct 03 09:00:13 crc kubenswrapper[4810]: I1003 09:00:13.459048 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bfbb-account-create-q9qdc" event={"ID":"65c121bd-7601-4845-8b51-c72706f78cbe","Type":"ContainerDied","Data":"0f21340868b9429edd2728b912b4f49b1e34b40140f6b795bd7c4f57f73d6f69"} Oct 03 09:00:13 crc kubenswrapper[4810]: I1003 09:00:13.459387 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bfbb-account-create-q9qdc" event={"ID":"65c121bd-7601-4845-8b51-c72706f78cbe","Type":"ContainerStarted","Data":"5a76cd306a3465b7f840045419ace5eb9b228d8a0f88b20404f9b4028dc2dfc3"} Oct 03 09:00:14 crc kubenswrapper[4810]: I1003 09:00:14.849461 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:14 crc kubenswrapper[4810]: I1003 09:00:14.960966 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x5md\" (UniqueName: \"kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md\") pod \"65c121bd-7601-4845-8b51-c72706f78cbe\" (UID: \"65c121bd-7601-4845-8b51-c72706f78cbe\") " Oct 03 09:00:14 crc kubenswrapper[4810]: I1003 09:00:14.968216 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md" (OuterVolumeSpecName: "kube-api-access-8x5md") pod "65c121bd-7601-4845-8b51-c72706f78cbe" (UID: "65c121bd-7601-4845-8b51-c72706f78cbe"). InnerVolumeSpecName "kube-api-access-8x5md". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:15 crc kubenswrapper[4810]: I1003 09:00:15.063870 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x5md\" (UniqueName: \"kubernetes.io/projected/65c121bd-7601-4845-8b51-c72706f78cbe-kube-api-access-8x5md\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:15 crc kubenswrapper[4810]: I1003 09:00:15.481981 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bfbb-account-create-q9qdc" event={"ID":"65c121bd-7601-4845-8b51-c72706f78cbe","Type":"ContainerDied","Data":"5a76cd306a3465b7f840045419ace5eb9b228d8a0f88b20404f9b4028dc2dfc3"} Oct 03 09:00:15 crc kubenswrapper[4810]: I1003 09:00:15.482077 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bfbb-account-create-q9qdc" Oct 03 09:00:15 crc kubenswrapper[4810]: I1003 09:00:15.482069 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a76cd306a3465b7f840045419ace5eb9b228d8a0f88b20404f9b4028dc2dfc3" Oct 03 09:00:16 crc kubenswrapper[4810]: I1003 09:00:16.714847 4810 scope.go:117] "RemoveContainer" containerID="29e1c159ec9e5d74ee70771b08953deee185a0b89e454f44ac410ea51d656962" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.204456 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:00:17 crc kubenswrapper[4810]: E1003 09:00:17.205994 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c121bd-7601-4845-8b51-c72706f78cbe" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.206024 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c121bd-7601-4845-8b51-c72706f78cbe" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.206251 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c121bd-7601-4845-8b51-c72706f78cbe" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.208040 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.219772 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hkm7m"] Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.221032 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.223492 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.223726 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.226538 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-b4tjt" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.232249 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.250383 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hkm7m"] Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318088 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318167 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318222 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f54m2\" (UniqueName: \"kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318262 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318314 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318347 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb\") pod 
\"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318393 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318426 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.318881 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdbz\" (UniqueName: \"kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420620 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420713 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420825 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdbz\" (UniqueName: \"kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420938 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420969 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.420997 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data\") pod \"placement-db-sync-hkm7m\" (UID: 
\"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f54m2\" (UniqueName: \"kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421118 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421164 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421189 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421666 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.421919 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.422985 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.423101 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.423389 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.428345 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.435826 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.438509 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.440622 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f54m2\" (UniqueName: \"kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2\") pod \"placement-db-sync-hkm7m\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.445086 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdbz\" (UniqueName: \"kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz\") pod \"dnsmasq-dns-795b7f6797-7wt5c\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.536322 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:17 crc kubenswrapper[4810]: I1003 09:00:17.546929 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.122941 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.152499 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hkm7m"] Oct 03 09:00:18 crc kubenswrapper[4810]: W1003 09:00:18.152919 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fe7112b_a8c0_4c69_bb2a_6f322ec0ac38.slice/crio-0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b WatchSource:0}: Error finding container 0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b: Status 404 returned error can't find the container with id 0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.517123 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hkm7m" event={"ID":"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38","Type":"ContainerStarted","Data":"0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b"} Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.520749 4810 generic.go:334] "Generic (PLEG): container finished" podID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerID="96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd" exitCode=0 Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.520856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" event={"ID":"8357f38e-0c4b-40d3-9c8c-06a5b127d871","Type":"ContainerDied","Data":"96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd"} Oct 03 09:00:18 crc kubenswrapper[4810]: I1003 09:00:18.521194 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" event={"ID":"8357f38e-0c4b-40d3-9c8c-06a5b127d871","Type":"ContainerStarted","Data":"1685d82be7357912e0edae2c4754023c5d69c5232f5258f5b95506e786018740"} Oct 03 09:00:19 crc kubenswrapper[4810]: I1003 09:00:19.534544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" event={"ID":"8357f38e-0c4b-40d3-9c8c-06a5b127d871","Type":"ContainerStarted","Data":"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe"} Oct 03 09:00:19 crc kubenswrapper[4810]: I1003 09:00:19.534866 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:19 crc kubenswrapper[4810]: I1003 09:00:19.569106 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" podStartSLOduration=2.569075228 podStartE2EDuration="2.569075228s" podCreationTimestamp="2025-10-03 09:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:19.553220565 +0000 UTC m=+7452.980471290" watchObservedRunningTime="2025-10-03 09:00:19.569075228 +0000 UTC m=+7452.996325963" Oct 03 09:00:22 crc kubenswrapper[4810]: I1003 09:00:22.562489 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hkm7m" event={"ID":"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38","Type":"ContainerStarted","Data":"d701b87a14577f87b03e4fa3f6ca9e1ae7965d82f527c984c868c7194359f6a3"} Oct 03 09:00:22 crc kubenswrapper[4810]: I1003 
09:00:22.592471 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hkm7m" podStartSLOduration=1.7375844950000001 podStartE2EDuration="5.59244596s" podCreationTimestamp="2025-10-03 09:00:17 +0000 UTC" firstStartedPulling="2025-10-03 09:00:18.164163515 +0000 UTC m=+7451.591414250" lastFinishedPulling="2025-10-03 09:00:22.01902498 +0000 UTC m=+7455.446275715" observedRunningTime="2025-10-03 09:00:22.586260394 +0000 UTC m=+7456.013511129" watchObservedRunningTime="2025-10-03 09:00:22.59244596 +0000 UTC m=+7456.019696685" Oct 03 09:00:24 crc kubenswrapper[4810]: I1003 09:00:24.583309 4810 generic.go:334] "Generic (PLEG): container finished" podID="7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" containerID="d701b87a14577f87b03e4fa3f6ca9e1ae7965d82f527c984c868c7194359f6a3" exitCode=0 Oct 03 09:00:24 crc kubenswrapper[4810]: I1003 09:00:24.583415 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hkm7m" event={"ID":"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38","Type":"ContainerDied","Data":"d701b87a14577f87b03e4fa3f6ca9e1ae7965d82f527c984c868c7194359f6a3"} Oct 03 09:00:25 crc kubenswrapper[4810]: I1003 09:00:25.934464 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.108780 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data\") pod \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.108911 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle\") pod \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.109065 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f54m2\" (UniqueName: \"kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2\") pod \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.109160 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs\") pod \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.109306 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts\") pod \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\" (UID: \"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38\") " Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.109513 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs" (OuterVolumeSpecName: "logs") pod "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" (UID: "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.109951 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.114850 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts" (OuterVolumeSpecName: "scripts") pod "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" (UID: "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.115000 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2" (OuterVolumeSpecName: "kube-api-access-f54m2") pod "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" (UID: "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38"). InnerVolumeSpecName "kube-api-access-f54m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.134732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" (UID: "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.135305 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data" (OuterVolumeSpecName: "config-data") pod "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" (UID: "7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.212305 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.212344 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.212358 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.212373 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f54m2\" (UniqueName: \"kubernetes.io/projected/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38-kube-api-access-f54m2\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.608086 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hkm7m" event={"ID":"7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38","Type":"ContainerDied","Data":"0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b"} Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.608616 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e6952e0210cbde86d04a80e27fe191f89901dae3765eee5ebf16ee7b66d7f4b" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.608162 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hkm7m" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.705797 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:00:26 crc kubenswrapper[4810]: E1003 09:00:26.706203 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" containerName="placement-db-sync" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.706220 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" containerName="placement-db-sync" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.706421 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" containerName="placement-db-sync" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.707412 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.715531 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.715576 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.715738 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-b4tjt" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.715873 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.716444 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.731836 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.824147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.824453 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.824563 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.824680 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.824869 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbqkf\" (UniqueName: \"kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.825090 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.825216 4810 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.926978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927124 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927188 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927217 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927252 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927281 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbqkf\" (UniqueName: \"kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.927538 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.933232 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.933538 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.939546 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.941178 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.946459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:26 crc kubenswrapper[4810]: I1003 09:00:26.947803 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbqkf\" (UniqueName: \"kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf\") pod \"placement-c96f8d7b-nv762\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.027701 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.486663 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.538670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.601820 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.602117 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59876f5b47-67882" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="dnsmasq-dns" containerID="cri-o://3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2" gracePeriod=10 Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.669342 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerStarted","Data":"dfad215d84e80e7be57e8b4eb52259b776b7a757a4e050faaf8e666cb4529c89"} Oct 03 09:00:27 crc kubenswrapper[4810]: I1003 09:00:27.697482 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59876f5b47-67882" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.75:5353: connect: connection refused" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.553006 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.671719 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config\") pod \"92e89a61-317b-4bf4-b312-e9840f68c983\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.671804 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb\") pod \"92e89a61-317b-4bf4-b312-e9840f68c983\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.671962 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p46nh\" (UniqueName: \"kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh\") pod \"92e89a61-317b-4bf4-b312-e9840f68c983\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.672043 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb\") pod \"92e89a61-317b-4bf4-b312-e9840f68c983\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.672137 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc\") pod \"92e89a61-317b-4bf4-b312-e9840f68c983\" (UID: \"92e89a61-317b-4bf4-b312-e9840f68c983\") " Oct 03 09:00:28 crc kubenswrapper[4810]: 
I1003 09:00:28.676217 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh" (OuterVolumeSpecName: "kube-api-access-p46nh") pod "92e89a61-317b-4bf4-b312-e9840f68c983" (UID: "92e89a61-317b-4bf4-b312-e9840f68c983"). InnerVolumeSpecName "kube-api-access-p46nh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.679836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerStarted","Data":"aafd95e2c2156c5d62c9e52e62267e00c8cc9c2c5da26c2f67f4a5b4754d0e28"} Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.679950 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerStarted","Data":"0438a7801eb086bb2f2988fbf6a2faaadb40a8b29e9157eea6ecfa708ea3864e"} Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.680145 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.680179 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.682312 4810 generic.go:334] "Generic (PLEG): container finished" podID="92e89a61-317b-4bf4-b312-e9840f68c983" containerID="3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2" exitCode=0 Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.682442 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59876f5b47-67882" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.682458 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59876f5b47-67882" event={"ID":"92e89a61-317b-4bf4-b312-e9840f68c983","Type":"ContainerDied","Data":"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2"} Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.682740 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59876f5b47-67882" event={"ID":"92e89a61-317b-4bf4-b312-e9840f68c983","Type":"ContainerDied","Data":"397c6616edfd1055f0cd0927ec8f3e9623b29d76a12d638b93d4479f25e7d6b4"} Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.682831 4810 scope.go:117] "RemoveContainer" containerID="3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.707070 4810 scope.go:117] "RemoveContainer" containerID="29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.710370 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c96f8d7b-nv762" podStartSLOduration=2.710348814 podStartE2EDuration="2.710348814s" podCreationTimestamp="2025-10-03 09:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:28.700388378 +0000 UTC m=+7462.127639143" watchObservedRunningTime="2025-10-03 09:00:28.710348814 +0000 UTC m=+7462.137599549" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.724337 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "92e89a61-317b-4bf4-b312-e9840f68c983" (UID: "92e89a61-317b-4bf4-b312-e9840f68c983"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.725864 4810 scope.go:117] "RemoveContainer" containerID="3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2" Oct 03 09:00:28 crc kubenswrapper[4810]: E1003 09:00:28.726460 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2\": container with ID starting with 3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2 not found: ID does not exist" containerID="3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.726496 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2"} err="failed to get container status \"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2\": rpc error: code = NotFound desc = could not find container \"3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2\": container with ID starting with 3ab6f7ecc7d8e1ec573f39e3912561d0b3301e785b9b16d16d8cbda3dcb291d2 not found: ID does not exist" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.726523 4810 scope.go:117] "RemoveContainer" containerID="29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7" Oct 03 09:00:28 crc kubenswrapper[4810]: E1003 09:00:28.726932 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7\": container with ID starting with 29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7 not found: ID does not exist" containerID="29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.726978 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7"} err="failed to get container status \"29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7\": rpc error: code = NotFound desc = could not find container \"29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7\": container with ID starting with 29cd88f87b1986a0cc4613b5f29d208be74a90660f300fca71d049f538d6a7d7 not found: ID does not exist" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.727250 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92e89a61-317b-4bf4-b312-e9840f68c983" (UID: "92e89a61-317b-4bf4-b312-e9840f68c983"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.735064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config" (OuterVolumeSpecName: "config") pod "92e89a61-317b-4bf4-b312-e9840f68c983" (UID: "92e89a61-317b-4bf4-b312-e9840f68c983"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.739555 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92e89a61-317b-4bf4-b312-e9840f68c983" (UID: "92e89a61-317b-4bf4-b312-e9840f68c983"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.774636 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.774671 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.774681 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.774689 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92e89a61-317b-4bf4-b312-e9840f68c983-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4810]: I1003 09:00:28.774699 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p46nh\" (UniqueName: \"kubernetes.io/projected/92e89a61-317b-4bf4-b312-e9840f68c983-kube-api-access-p46nh\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:29 crc kubenswrapper[4810]: I1003 09:00:29.046244 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 09:00:29 crc kubenswrapper[4810]: I1003 09:00:29.059951 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59876f5b47-67882"] Oct 03 09:00:29 crc kubenswrapper[4810]: I1003 09:00:29.323428 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" path="/var/lib/kubelet/pods/92e89a61-317b-4bf4-b312-e9840f68c983/volumes" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.529703 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:00:42 crc kubenswrapper[4810]: E1003 09:00:42.530756 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="dnsmasq-dns" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.530773 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="dnsmasq-dns" Oct 03 09:00:42 crc kubenswrapper[4810]: E1003 09:00:42.530795 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="init" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.530803 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="init" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.531029 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e89a61-317b-4bf4-b312-e9840f68c983" containerName="dnsmasq-dns" Oct 03 09:00:42 
crc kubenswrapper[4810]: I1003 09:00:42.539863 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.547530 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.666904 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.666995 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.667189 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd6qv\" (UniqueName: \"kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.769445 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.769535 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.769651 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd6qv\" (UniqueName: \"kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.770218 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.770266 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 
09:00:42.796404 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd6qv\" (UniqueName: \"kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv\") pod \"redhat-operators-2hsgx\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:42 crc kubenswrapper[4810]: I1003 09:00:42.882802 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:43 crc kubenswrapper[4810]: I1003 09:00:43.367782 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:00:43 crc kubenswrapper[4810]: W1003 09:00:43.382392 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e30aaa_5318_42e1_99ef_a08cc62fb8ee.slice/crio-2ec48941e8d85e6bbd03b1f248917b3e69266b0ec5b09b06016dd4d67233e3d3 WatchSource:0}: Error finding container 2ec48941e8d85e6bbd03b1f248917b3e69266b0ec5b09b06016dd4d67233e3d3: Status 404 returned error can't find the container with id 2ec48941e8d85e6bbd03b1f248917b3e69266b0ec5b09b06016dd4d67233e3d3 Oct 03 09:00:43 crc kubenswrapper[4810]: I1003 09:00:43.827819 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerStarted","Data":"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b"} Oct 03 09:00:43 crc kubenswrapper[4810]: I1003 09:00:43.828173 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerStarted","Data":"2ec48941e8d85e6bbd03b1f248917b3e69266b0ec5b09b06016dd4d67233e3d3"} Oct 03 09:00:44 crc kubenswrapper[4810]: I1003 09:00:44.839853 4810 generic.go:334] "Generic (PLEG): container finished" podID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerID="4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b" exitCode=0 Oct 03 09:00:44 crc kubenswrapper[4810]: I1003 09:00:44.840394 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerDied","Data":"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b"} Oct 03 09:00:46 crc kubenswrapper[4810]: I1003 09:00:46.867615 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerStarted","Data":"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2"} Oct 03 09:00:47 crc kubenswrapper[4810]: I1003 09:00:47.884505 4810 generic.go:334] "Generic (PLEG): container finished" podID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerID="e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2" exitCode=0 Oct 03 09:00:47 crc kubenswrapper[4810]: I1003 09:00:47.884625 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerDied","Data":"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2"} Oct 03 09:00:49 crc kubenswrapper[4810]: I1003 09:00:49.909200 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" 
event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerStarted","Data":"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583"} Oct 03 09:00:49 crc kubenswrapper[4810]: I1003 09:00:49.939784 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2hsgx" podStartSLOduration=4.114652873 podStartE2EDuration="7.939751853s" podCreationTimestamp="2025-10-03 09:00:42 +0000 UTC" firstStartedPulling="2025-10-03 09:00:44.855880374 +0000 UTC m=+7478.283131149" lastFinishedPulling="2025-10-03 09:00:48.680979404 +0000 UTC m=+7482.108230129" observedRunningTime="2025-10-03 09:00:49.931225026 +0000 UTC m=+7483.358475771" watchObservedRunningTime="2025-10-03 09:00:49.939751853 +0000 UTC m=+7483.367002598" Oct 03 09:00:52 crc kubenswrapper[4810]: I1003 09:00:52.883438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:52 crc kubenswrapper[4810]: I1003 09:00:52.883826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:00:53 crc kubenswrapper[4810]: I1003 09:00:53.932808 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2hsgx" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="registry-server" probeResult="failure" output=< Oct 03 09:00:53 crc kubenswrapper[4810]: timeout: failed to connect service ":50051" within 1s Oct 03 09:00:53 crc kubenswrapper[4810]: > Oct 03 09:00:58 crc kubenswrapper[4810]: I1003 09:00:58.094023 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:00:58 crc kubenswrapper[4810]: I1003 09:00:58.095510 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.141487 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29324701-wvvbm"] Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.142711 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.151857 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-wvvbm"] Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.243488 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.243809 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.243857 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5rt\" (UniqueName: \"kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.243960 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.346144 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.346282 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.346318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.346351 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5rt\" (UniqueName: \"kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.352417 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.356146 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.364632 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.369346 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5rt\" (UniqueName: \"kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt\") pod \"keystone-cron-29324701-wvvbm\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.476295 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:00 crc kubenswrapper[4810]: I1003 09:01:00.948825 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-wvvbm"] Oct 03 09:01:01 crc kubenswrapper[4810]: I1003 09:01:01.000256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-wvvbm" event={"ID":"84fa4b6d-43d9-4b34-a264-80a71cfcdb50","Type":"ContainerStarted","Data":"2bfd2ac2ee8a32ebb699788ff777a9f82931333c8a930af084ec0a67a744c49b"} Oct 03 09:01:02 crc kubenswrapper[4810]: I1003 09:01:02.011939 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-wvvbm" event={"ID":"84fa4b6d-43d9-4b34-a264-80a71cfcdb50","Type":"ContainerStarted","Data":"4c55292548a4d03c22af14ea11a94e15075846c2b8747762ba1fb825a828816a"} Oct 03 09:01:02 crc kubenswrapper[4810]: I1003 09:01:02.044944 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29324701-wvvbm" podStartSLOduration=2.044917044 podStartE2EDuration="2.044917044s" podCreationTimestamp="2025-10-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:02.036337214 +0000 UTC m=+7495.463587969" watchObservedRunningTime="2025-10-03 09:01:02.044917044 +0000 UTC m=+7495.472167779" Oct 03 09:01:02 crc kubenswrapper[4810]: I1003 09:01:02.935393 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:01:03 crc kubenswrapper[4810]: I1003 09:01:03.007263 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:01:03 crc kubenswrapper[4810]: I1003 09:01:03.173745 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.038018 
4810 generic.go:334] "Generic (PLEG): container finished" podID="84fa4b6d-43d9-4b34-a264-80a71cfcdb50" containerID="4c55292548a4d03c22af14ea11a94e15075846c2b8747762ba1fb825a828816a" exitCode=0 Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.038127 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-wvvbm" event={"ID":"84fa4b6d-43d9-4b34-a264-80a71cfcdb50","Type":"ContainerDied","Data":"4c55292548a4d03c22af14ea11a94e15075846c2b8747762ba1fb825a828816a"} Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.038727 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2hsgx" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="registry-server" containerID="cri-o://6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583" gracePeriod=2 Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.510489 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.638635 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities\") pod \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.639634 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content\") pod \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.639715 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd6qv\" (UniqueName: \"kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv\") pod \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\" (UID: \"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee\") " Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.639999 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities" (OuterVolumeSpecName: "utilities") pod "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" (UID: "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.640568 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.646832 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv" (OuterVolumeSpecName: "kube-api-access-xd6qv") pod "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" (UID: "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee"). InnerVolumeSpecName "kube-api-access-xd6qv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.723005 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" (UID: "f0e30aaa-5318-42e1-99ef-a08cc62fb8ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.742279 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:04 crc kubenswrapper[4810]: I1003 09:01:04.742329 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd6qv\" (UniqueName: \"kubernetes.io/projected/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee-kube-api-access-xd6qv\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.052126 4810 generic.go:334] "Generic (PLEG): container finished" podID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerID="6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583" exitCode=0 Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.052194 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerDied","Data":"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583"} Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.052231 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2hsgx" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.052262 4810 scope.go:117] "RemoveContainer" containerID="6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.052246 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2hsgx" event={"ID":"f0e30aaa-5318-42e1-99ef-a08cc62fb8ee","Type":"ContainerDied","Data":"2ec48941e8d85e6bbd03b1f248917b3e69266b0ec5b09b06016dd4d67233e3d3"} Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.078840 4810 scope.go:117] "RemoveContainer" containerID="e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.095440 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.104697 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2hsgx"] Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.128066 4810 scope.go:117] "RemoveContainer" containerID="4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.168494 4810 scope.go:117] "RemoveContainer" containerID="6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583" Oct 03 09:01:05 crc kubenswrapper[4810]: E1003 09:01:05.169036 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583\": container with ID starting with 6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583 
not found: ID does not exist" containerID="6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.169066 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583"} err="failed to get container status \"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583\": rpc error: code = NotFound desc = could not find container \"6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583\": container with ID starting with 6d1d7d7fa501e28bad06ac43c66d221517fea9f60d41d472c9755a9cfbbed583 not found: ID does not exist" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.169091 4810 scope.go:117] "RemoveContainer" containerID="e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2" Oct 03 09:01:05 crc kubenswrapper[4810]: E1003 09:01:05.170150 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2\": container with ID starting with e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2 not found: ID does not exist" containerID="e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.170178 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2"} err="failed to get container status \"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2\": rpc error: code = NotFound desc = could not find container \"e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2\": container with ID starting with e5cbd909d5f35301579b361ea89cd46ab08b9c6709749d214e1cc6a478d740c2 not found: ID does not exist" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.170197 4810 scope.go:117] "RemoveContainer" containerID="4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b" Oct 03 09:01:05 crc kubenswrapper[4810]: E1003 09:01:05.170856 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b\": container with ID starting with 4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b not found: ID does not exist" containerID="4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.170883 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b"} err="failed to get container status \"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b\": rpc error: code = NotFound desc = could not find container \"4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b\": container with ID starting with 4cea6bdd813025a77d66e54e963addf9e1e1e00ef6b721af41b4f5bad5cc084b not found: ID does not exist" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.317220 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" path="/var/lib/kubelet/pods/f0e30aaa-5318-42e1-99ef-a08cc62fb8ee/volumes" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.368592 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.462415 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn5rt\" (UniqueName: \"kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt\") pod \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.462973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys\") pod \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.463083 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data\") pod \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.463129 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle\") pod \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\" (UID: \"84fa4b6d-43d9-4b34-a264-80a71cfcdb50\") " Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.467788 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "84fa4b6d-43d9-4b34-a264-80a71cfcdb50" (UID: "84fa4b6d-43d9-4b34-a264-80a71cfcdb50"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.468212 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt" (OuterVolumeSpecName: "kube-api-access-mn5rt") pod "84fa4b6d-43d9-4b34-a264-80a71cfcdb50" (UID: "84fa4b6d-43d9-4b34-a264-80a71cfcdb50"). InnerVolumeSpecName "kube-api-access-mn5rt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.490950 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84fa4b6d-43d9-4b34-a264-80a71cfcdb50" (UID: "84fa4b6d-43d9-4b34-a264-80a71cfcdb50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.515490 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data" (OuterVolumeSpecName: "config-data") pod "84fa4b6d-43d9-4b34-a264-80a71cfcdb50" (UID: "84fa4b6d-43d9-4b34-a264-80a71cfcdb50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.566155 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.566199 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.566213 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4810]: I1003 09:01:05.566225 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn5rt\" (UniqueName: \"kubernetes.io/projected/84fa4b6d-43d9-4b34-a264-80a71cfcdb50-kube-api-access-mn5rt\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:06 crc kubenswrapper[4810]: I1003 09:01:06.061233 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-wvvbm" event={"ID":"84fa4b6d-43d9-4b34-a264-80a71cfcdb50","Type":"ContainerDied","Data":"2bfd2ac2ee8a32ebb699788ff777a9f82931333c8a930af084ec0a67a744c49b"} Oct 03 09:01:06 crc kubenswrapper[4810]: I1003 09:01:06.061273 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bfd2ac2ee8a32ebb699788ff777a9f82931333c8a930af084ec0a67a744c49b" Oct 03 09:01:06 crc kubenswrapper[4810]: I1003 09:01:06.061281 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-wvvbm" Oct 03 09:01:13 crc kubenswrapper[4810]: I1003 09:01:13.042584 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lmhgp"] Oct 03 09:01:13 crc kubenswrapper[4810]: I1003 09:01:13.049463 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lmhgp"] Oct 03 09:01:13 crc kubenswrapper[4810]: I1003 09:01:13.313059 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66eb28ac-c56f-4700-9faa-85e3327c303b" path="/var/lib/kubelet/pods/66eb28ac-c56f-4700-9faa-85e3327c303b/volumes" Oct 03 09:01:16 crc kubenswrapper[4810]: I1003 09:01:16.869826 4810 scope.go:117] "RemoveContainer" containerID="16eb23acd136f97ac0bb85b390a4ff4943be9ff9d066f146c47f808111970b89" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.114112 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vxq2p"] Oct 03 09:01:18 crc kubenswrapper[4810]: E1003 09:01:18.114768 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84fa4b6d-43d9-4b34-a264-80a71cfcdb50" containerName="keystone-cron" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.114780 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fa4b6d-43d9-4b34-a264-80a71cfcdb50" containerName="keystone-cron" Oct 03 09:01:18 crc kubenswrapper[4810]: E1003 09:01:18.114801 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="extract-content" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.114807 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="extract-content" Oct 03 
09:01:18 crc kubenswrapper[4810]: E1003 09:01:18.114818 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="registry-server" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.114824 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="registry-server" Oct 03 09:01:18 crc kubenswrapper[4810]: E1003 09:01:18.114835 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="extract-utilities" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.114841 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="extract-utilities" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.115043 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0e30aaa-5318-42e1-99ef-a08cc62fb8ee" containerName="registry-server" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.115058 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="84fa4b6d-43d9-4b34-a264-80a71cfcdb50" containerName="keystone-cron" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.115655 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.125600 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vxq2p"] Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.213730 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-s9mvg"] Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.215193 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.222807 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s9mvg"] Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.223229 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brfss\" (UniqueName: \"kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss\") pod \"nova-api-db-create-vxq2p\" (UID: \"1d118cd8-5e30-48cc-aee4-db15dfb65704\") " pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.325182 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcsw\" (UniqueName: \"kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw\") pod \"nova-cell0-db-create-s9mvg\" (UID: \"4b1790a9-927d-4618-9eb7-c5ced54af4a9\") " pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.325300 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brfss\" (UniqueName: \"kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss\") pod \"nova-api-db-create-vxq2p\" (UID: \"1d118cd8-5e30-48cc-aee4-db15dfb65704\") " pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.353652 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tfcdh"] Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.355460 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.379286 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brfss\" (UniqueName: \"kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss\") pod \"nova-api-db-create-vxq2p\" (UID: \"1d118cd8-5e30-48cc-aee4-db15dfb65704\") " pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.422737 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfcdh"] Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.428492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvv7f\" (UniqueName: \"kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f\") pod \"nova-cell1-db-create-tfcdh\" (UID: \"3e2d06bb-19a4-4aa4-aec7-783a567726db\") " pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.428607 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcsw\" (UniqueName: \"kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw\") pod \"nova-cell0-db-create-s9mvg\" (UID: \"4b1790a9-927d-4618-9eb7-c5ced54af4a9\") " pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.443048 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.481865 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcsw\" (UniqueName: \"kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw\") pod \"nova-cell0-db-create-s9mvg\" (UID: \"4b1790a9-927d-4618-9eb7-c5ced54af4a9\") " pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.533767 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvv7f\" (UniqueName: \"kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f\") pod \"nova-cell1-db-create-tfcdh\" (UID: \"3e2d06bb-19a4-4aa4-aec7-783a567726db\") " pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.537308 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.552770 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvv7f\" (UniqueName: \"kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f\") pod \"nova-cell1-db-create-tfcdh\" (UID: \"3e2d06bb-19a4-4aa4-aec7-783a567726db\") " pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:18 crc kubenswrapper[4810]: I1003 09:01:18.754495 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.007661 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vxq2p"] Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.077712 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-s9mvg"] Oct 03 09:01:19 crc kubenswrapper[4810]: W1003 09:01:19.082096 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b1790a9_927d_4618_9eb7_c5ced54af4a9.slice/crio-3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623 WatchSource:0}: Error finding container 3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623: Status 404 returned error can't find the container with id 3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623 Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.204870 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfcdh"] Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.225319 4810 generic.go:334] "Generic (PLEG): container finished" podID="1d118cd8-5e30-48cc-aee4-db15dfb65704" containerID="cf66cf254afe632778a8748f2f6bcac4a8b8451ec367729eb4ff712c93100586" exitCode=0 Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.225557 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vxq2p" event={"ID":"1d118cd8-5e30-48cc-aee4-db15dfb65704","Type":"ContainerDied","Data":"cf66cf254afe632778a8748f2f6bcac4a8b8451ec367729eb4ff712c93100586"} Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.225628 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vxq2p" event={"ID":"1d118cd8-5e30-48cc-aee4-db15dfb65704","Type":"ContainerStarted","Data":"a646a1e5ec54b48d6cf7fcf086f02c72b0e6893603cbd083e54a1c3bdecbb65d"} Oct 03 09:01:19 crc kubenswrapper[4810]: I1003 09:01:19.228550 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s9mvg" event={"ID":"4b1790a9-927d-4618-9eb7-c5ced54af4a9","Type":"ContainerStarted","Data":"3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623"} Oct 03 09:01:19 crc kubenswrapper[4810]: W1003 09:01:19.250096 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e2d06bb_19a4_4aa4_aec7_783a567726db.slice/crio-7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37 WatchSource:0}: Error finding container 7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37: Status 404 returned error can't find the container with id 7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37 Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.241452 4810 generic.go:334] "Generic (PLEG): container finished" podID="3e2d06bb-19a4-4aa4-aec7-783a567726db" containerID="43165b7a64643ef3cab4e29178f0beafafa146c94e61751d37588312b350db8e" exitCode=0 Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.241563 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfcdh" event={"ID":"3e2d06bb-19a4-4aa4-aec7-783a567726db","Type":"ContainerDied","Data":"43165b7a64643ef3cab4e29178f0beafafa146c94e61751d37588312b350db8e"} Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.242477 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfcdh" 
event={"ID":"3e2d06bb-19a4-4aa4-aec7-783a567726db","Type":"ContainerStarted","Data":"7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37"} Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.245221 4810 generic.go:334] "Generic (PLEG): container finished" podID="4b1790a9-927d-4618-9eb7-c5ced54af4a9" containerID="040748a82a5e2224d0965fac2adc7254c91b10c0aa83bf36ab8b1bd8a548c542" exitCode=0 Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.245338 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s9mvg" event={"ID":"4b1790a9-927d-4618-9eb7-c5ced54af4a9","Type":"ContainerDied","Data":"040748a82a5e2224d0965fac2adc7254c91b10c0aa83bf36ab8b1bd8a548c542"} Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.659019 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.782771 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brfss\" (UniqueName: \"kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss\") pod \"1d118cd8-5e30-48cc-aee4-db15dfb65704\" (UID: \"1d118cd8-5e30-48cc-aee4-db15dfb65704\") " Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.788318 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss" (OuterVolumeSpecName: "kube-api-access-brfss") pod "1d118cd8-5e30-48cc-aee4-db15dfb65704" (UID: "1d118cd8-5e30-48cc-aee4-db15dfb65704"). InnerVolumeSpecName "kube-api-access-brfss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:20 crc kubenswrapper[4810]: I1003 09:01:20.885071 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brfss\" (UniqueName: \"kubernetes.io/projected/1d118cd8-5e30-48cc-aee4-db15dfb65704-kube-api-access-brfss\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.259294 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vxq2p" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.259583 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vxq2p" event={"ID":"1d118cd8-5e30-48cc-aee4-db15dfb65704","Type":"ContainerDied","Data":"a646a1e5ec54b48d6cf7fcf086f02c72b0e6893603cbd083e54a1c3bdecbb65d"} Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.259619 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a646a1e5ec54b48d6cf7fcf086f02c72b0e6893603cbd083e54a1c3bdecbb65d" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.640305 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.646472 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.699358 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbcsw\" (UniqueName: \"kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw\") pod \"4b1790a9-927d-4618-9eb7-c5ced54af4a9\" (UID: \"4b1790a9-927d-4618-9eb7-c5ced54af4a9\") " Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.699514 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvv7f\" (UniqueName: \"kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f\") pod \"3e2d06bb-19a4-4aa4-aec7-783a567726db\" (UID: \"3e2d06bb-19a4-4aa4-aec7-783a567726db\") " Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.704830 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f" (OuterVolumeSpecName: "kube-api-access-qvv7f") pod "3e2d06bb-19a4-4aa4-aec7-783a567726db" (UID: "3e2d06bb-19a4-4aa4-aec7-783a567726db"). InnerVolumeSpecName "kube-api-access-qvv7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.709522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw" (OuterVolumeSpecName: "kube-api-access-wbcsw") pod "4b1790a9-927d-4618-9eb7-c5ced54af4a9" (UID: "4b1790a9-927d-4618-9eb7-c5ced54af4a9"). InnerVolumeSpecName "kube-api-access-wbcsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.801524 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbcsw\" (UniqueName: \"kubernetes.io/projected/4b1790a9-927d-4618-9eb7-c5ced54af4a9-kube-api-access-wbcsw\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:21 crc kubenswrapper[4810]: I1003 09:01:21.801566 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvv7f\" (UniqueName: \"kubernetes.io/projected/3e2d06bb-19a4-4aa4-aec7-783a567726db-kube-api-access-qvv7f\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.270316 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfcdh" Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.270302 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfcdh" event={"ID":"3e2d06bb-19a4-4aa4-aec7-783a567726db","Type":"ContainerDied","Data":"7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37"} Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.270476 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7038031bc04f65462982f8370d2db23627888d49238f1fd07eb92c6108f2fc37" Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.271909 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-s9mvg" event={"ID":"4b1790a9-927d-4618-9eb7-c5ced54af4a9","Type":"ContainerDied","Data":"3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623"} Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.271949 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3be57b4e8657d18cc744a4d996ac5c74fb4fddd899ba540e25668445a8a9c623" Oct 03 09:01:22 crc kubenswrapper[4810]: I1003 09:01:22.272017 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-s9mvg" Oct 03 09:01:23 crc kubenswrapper[4810]: I1003 09:01:23.031226 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b067-account-create-fcrcx"] Oct 03 09:01:23 crc kubenswrapper[4810]: I1003 09:01:23.042218 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b067-account-create-fcrcx"] Oct 03 09:01:23 crc kubenswrapper[4810]: I1003 09:01:23.321943 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d008017-c649-4878-b691-07cbc101901f" path="/var/lib/kubelet/pods/7d008017-c649-4878-b691-07cbc101901f/volumes" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.369636 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6644-account-create-pflxp"] Oct 03 09:01:28 crc kubenswrapper[4810]: E1003 09:01:28.371570 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2d06bb-19a4-4aa4-aec7-783a567726db" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.371607 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2d06bb-19a4-4aa4-aec7-783a567726db" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: E1003 09:01:28.371702 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d118cd8-5e30-48cc-aee4-db15dfb65704" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.371710 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d118cd8-5e30-48cc-aee4-db15dfb65704" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: E1003 09:01:28.371783 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b1790a9-927d-4618-9eb7-c5ced54af4a9" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.371791 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b1790a9-927d-4618-9eb7-c5ced54af4a9" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.372039 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d118cd8-5e30-48cc-aee4-db15dfb65704" containerName="mariadb-database-create" Oct 03 09:01:28 crc 
kubenswrapper[4810]: I1003 09:01:28.372074 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2d06bb-19a4-4aa4-aec7-783a567726db" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.372086 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b1790a9-927d-4618-9eb7-c5ced54af4a9" containerName="mariadb-database-create" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.372827 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.376968 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.384045 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6644-account-create-pflxp"] Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.433809 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h96g\" (UniqueName: \"kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g\") pod \"nova-api-6644-account-create-pflxp\" (UID: \"b53b561d-83d8-4ca0-bfaa-a3f8473373fa\") " pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.536934 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h96g\" (UniqueName: \"kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g\") pod \"nova-api-6644-account-create-pflxp\" (UID: \"b53b561d-83d8-4ca0-bfaa-a3f8473373fa\") " pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.552690 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d0e8-account-create-f4k84"] Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.560498 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.564322 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.569171 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h96g\" (UniqueName: \"kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g\") pod \"nova-api-6644-account-create-pflxp\" (UID: \"b53b561d-83d8-4ca0-bfaa-a3f8473373fa\") " pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.575705 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d0e8-account-create-f4k84"] Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.639106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5nj8\" (UniqueName: \"kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8\") pod \"nova-cell0-d0e8-account-create-f4k84\" (UID: \"dda5331c-3fec-4b0e-85da-2783fc44b5a2\") " pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.697065 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.741442 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5nj8\" (UniqueName: \"kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8\") pod \"nova-cell0-d0e8-account-create-f4k84\" (UID: \"dda5331c-3fec-4b0e-85da-2783fc44b5a2\") " pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.757634 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3fd2-account-create-qmhk4"] Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.759458 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.762244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.764599 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5nj8\" (UniqueName: \"kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8\") pod \"nova-cell0-d0e8-account-create-f4k84\" (UID: \"dda5331c-3fec-4b0e-85da-2783fc44b5a2\") " pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.768876 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3fd2-account-create-qmhk4"] Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.845831 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh82c\" (UniqueName: \"kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c\") pod \"nova-cell1-3fd2-account-create-qmhk4\" (UID: \"60dae612-9f86-4dea-a7ca-c641b3622c7d\") " pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.912950 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.947468 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh82c\" (UniqueName: \"kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c\") pod \"nova-cell1-3fd2-account-create-qmhk4\" (UID: \"60dae612-9f86-4dea-a7ca-c641b3622c7d\") " pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:28 crc kubenswrapper[4810]: I1003 09:01:28.969997 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh82c\" (UniqueName: \"kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c\") pod \"nova-cell1-3fd2-account-create-qmhk4\" (UID: \"60dae612-9f86-4dea-a7ca-c641b3622c7d\") " pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.151079 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.166003 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6644-account-create-pflxp"] Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.353696 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d0e8-account-create-f4k84"] Oct 03 09:01:29 crc kubenswrapper[4810]: W1003 09:01:29.363526 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddda5331c_3fec_4b0e_85da_2783fc44b5a2.slice/crio-087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152 WatchSource:0}: Error finding container 087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152: Status 404 returned error can't find the container with id 087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152 Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.375674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6644-account-create-pflxp" event={"ID":"b53b561d-83d8-4ca0-bfaa-a3f8473373fa","Type":"ContainerStarted","Data":"e14f117901d3ecf707af80f5154071e005064daf8d9971cd1b6b48315e536ea9"} Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.375766 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6644-account-create-pflxp" event={"ID":"b53b561d-83d8-4ca0-bfaa-a3f8473373fa","Type":"ContainerStarted","Data":"f0f8619d9da62a9b3358e4e2c5091592b0053dddead75bacffcef22536c5171a"} Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.399869 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-6644-account-create-pflxp" podStartSLOduration=1.3998506019999999 podStartE2EDuration="1.399850602s" podCreationTimestamp="2025-10-03 09:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:29.394563871 +0000 UTC m=+7522.821814606" watchObservedRunningTime="2025-10-03 09:01:29.399850602 +0000 UTC m=+7522.827101337" Oct 03 09:01:29 crc kubenswrapper[4810]: I1003 09:01:29.581936 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3fd2-account-create-qmhk4"] Oct 03 09:01:29 crc kubenswrapper[4810]: W1003 09:01:29.584307 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60dae612_9f86_4dea_a7ca_c641b3622c7d.slice/crio-e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572 WatchSource:0}: Error finding container e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572: Status 404 returned error can't find the container with id e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572 Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.391558 4810 generic.go:334] "Generic (PLEG): container finished" podID="60dae612-9f86-4dea-a7ca-c641b3622c7d" containerID="ff9e2e550b1bf5f4b6636a6d608048c84dd243ea8639895d89b0add8d1330eac" exitCode=0 Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.391720 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" event={"ID":"60dae612-9f86-4dea-a7ca-c641b3622c7d","Type":"ContainerDied","Data":"ff9e2e550b1bf5f4b6636a6d608048c84dd243ea8639895d89b0add8d1330eac"} Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.391772 4810 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" event={"ID":"60dae612-9f86-4dea-a7ca-c641b3622c7d","Type":"ContainerStarted","Data":"e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572"} Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.397918 4810 generic.go:334] "Generic (PLEG): container finished" podID="b53b561d-83d8-4ca0-bfaa-a3f8473373fa" containerID="e14f117901d3ecf707af80f5154071e005064daf8d9971cd1b6b48315e536ea9" exitCode=0 Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.398111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6644-account-create-pflxp" event={"ID":"b53b561d-83d8-4ca0-bfaa-a3f8473373fa","Type":"ContainerDied","Data":"e14f117901d3ecf707af80f5154071e005064daf8d9971cd1b6b48315e536ea9"} Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.402072 4810 generic.go:334] "Generic (PLEG): container finished" podID="dda5331c-3fec-4b0e-85da-2783fc44b5a2" containerID="f8e127303bc8a535b97b0d6e9b163d2ee3db2eaaf4c2ccfc515413c4ceb5fcbc" exitCode=0 Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.402202 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0e8-account-create-f4k84" event={"ID":"dda5331c-3fec-4b0e-85da-2783fc44b5a2","Type":"ContainerDied","Data":"f8e127303bc8a535b97b0d6e9b163d2ee3db2eaaf4c2ccfc515413c4ceb5fcbc"} Oct 03 09:01:30 crc kubenswrapper[4810]: I1003 09:01:30.402369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0e8-account-create-f4k84" event={"ID":"dda5331c-3fec-4b0e-85da-2783fc44b5a2","Type":"ContainerStarted","Data":"087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152"} Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.863856 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.872089 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.879358 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.906207 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh82c\" (UniqueName: \"kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c\") pod \"60dae612-9f86-4dea-a7ca-c641b3622c7d\" (UID: \"60dae612-9f86-4dea-a7ca-c641b3622c7d\") " Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.906461 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h96g\" (UniqueName: \"kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g\") pod \"b53b561d-83d8-4ca0-bfaa-a3f8473373fa\" (UID: \"b53b561d-83d8-4ca0-bfaa-a3f8473373fa\") " Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.912930 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g" (OuterVolumeSpecName: "kube-api-access-4h96g") pod "b53b561d-83d8-4ca0-bfaa-a3f8473373fa" (UID: "b53b561d-83d8-4ca0-bfaa-a3f8473373fa"). InnerVolumeSpecName "kube-api-access-4h96g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:31 crc kubenswrapper[4810]: I1003 09:01:31.913188 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c" (OuterVolumeSpecName: "kube-api-access-sh82c") pod "60dae612-9f86-4dea-a7ca-c641b3622c7d" (UID: "60dae612-9f86-4dea-a7ca-c641b3622c7d"). InnerVolumeSpecName "kube-api-access-sh82c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.008497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5nj8\" (UniqueName: \"kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8\") pod \"dda5331c-3fec-4b0e-85da-2783fc44b5a2\" (UID: \"dda5331c-3fec-4b0e-85da-2783fc44b5a2\") " Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.009444 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h96g\" (UniqueName: \"kubernetes.io/projected/b53b561d-83d8-4ca0-bfaa-a3f8473373fa-kube-api-access-4h96g\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.009463 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh82c\" (UniqueName: \"kubernetes.io/projected/60dae612-9f86-4dea-a7ca-c641b3622c7d-kube-api-access-sh82c\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.013339 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8" (OuterVolumeSpecName: "kube-api-access-n5nj8") pod "dda5331c-3fec-4b0e-85da-2783fc44b5a2" (UID: "dda5331c-3fec-4b0e-85da-2783fc44b5a2"). InnerVolumeSpecName "kube-api-access-n5nj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.088665 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.088734 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.110914 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5nj8\" (UniqueName: \"kubernetes.io/projected/dda5331c-3fec-4b0e-85da-2783fc44b5a2-kube-api-access-n5nj8\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.429847 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d0e8-account-create-f4k84" event={"ID":"dda5331c-3fec-4b0e-85da-2783fc44b5a2","Type":"ContainerDied","Data":"087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152"} Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.429910 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="087278b589cdcbef000a870654cde44b2a152095a26131cf32602b94af865152" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.429954 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d0e8-account-create-f4k84" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.431879 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" event={"ID":"60dae612-9f86-4dea-a7ca-c641b3622c7d","Type":"ContainerDied","Data":"e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572"} Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.431978 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e49e5c4026fd35227ead11ee919a9a37198812b3cd66ca1c2273e444c14a6572" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.431974 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3fd2-account-create-qmhk4" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.437266 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6644-account-create-pflxp" event={"ID":"b53b561d-83d8-4ca0-bfaa-a3f8473373fa","Type":"ContainerDied","Data":"f0f8619d9da62a9b3358e4e2c5091592b0053dddead75bacffcef22536c5171a"} Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.437320 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0f8619d9da62a9b3358e4e2c5091592b0053dddead75bacffcef22536c5171a" Oct 03 09:01:32 crc kubenswrapper[4810]: I1003 09:01:32.437396 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6644-account-create-pflxp" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.705404 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qcck4"] Oct 03 09:01:33 crc kubenswrapper[4810]: E1003 09:01:33.722589 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60dae612-9f86-4dea-a7ca-c641b3622c7d" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.722633 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="60dae612-9f86-4dea-a7ca-c641b3622c7d" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: E1003 09:01:33.722673 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda5331c-3fec-4b0e-85da-2783fc44b5a2" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.722685 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda5331c-3fec-4b0e-85da-2783fc44b5a2" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: E1003 09:01:33.722714 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b53b561d-83d8-4ca0-bfaa-a3f8473373fa" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.722725 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b53b561d-83d8-4ca0-bfaa-a3f8473373fa" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.723424 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="60dae612-9f86-4dea-a7ca-c641b3622c7d" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.723442 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda5331c-3fec-4b0e-85da-2783fc44b5a2" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.723474 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b53b561d-83d8-4ca0-bfaa-a3f8473373fa" containerName="mariadb-account-create" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.726078 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.732384 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.732618 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.733488 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrjql" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.736192 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qcck4"] Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.851416 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.851511 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsrz\" (UniqueName: \"kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.851661 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.851722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.953361 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.953409 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsrz\" (UniqueName: \"kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.953513 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data\") pod \"nova-cell0-conductor-db-sync-qcck4\" 
(UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.953558 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.957805 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.958002 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.958763 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:33 crc kubenswrapper[4810]: I1003 09:01:33.990912 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsrz\" (UniqueName: \"kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz\") pod \"nova-cell0-conductor-db-sync-qcck4\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:34 crc kubenswrapper[4810]: I1003 09:01:34.072104 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:34 crc kubenswrapper[4810]: I1003 09:01:34.522046 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qcck4"] Oct 03 09:01:34 crc kubenswrapper[4810]: W1003 09:01:34.532132 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57a4752c_c640_4ea1_9044_a70c59b85c50.slice/crio-3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3 WatchSource:0}: Error finding container 3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3: Status 404 returned error can't find the container with id 3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3 Oct 03 09:01:34 crc kubenswrapper[4810]: I1003 09:01:34.534656 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:01:35 crc kubenswrapper[4810]: I1003 09:01:35.052285 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-shp2j"] Oct 03 09:01:35 crc kubenswrapper[4810]: I1003 09:01:35.062974 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-shp2j"] Oct 03 09:01:35 crc kubenswrapper[4810]: I1003 09:01:35.316604 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9235a01f-b369-484b-bc48-fa523d863294" path="/var/lib/kubelet/pods/9235a01f-b369-484b-bc48-fa523d863294/volumes" Oct 03 09:01:35 crc kubenswrapper[4810]: I1003 09:01:35.464761 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qcck4" event={"ID":"57a4752c-c640-4ea1-9044-a70c59b85c50","Type":"ContainerStarted","Data":"3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3"} Oct 03 09:01:43 crc kubenswrapper[4810]: I1003 09:01:43.551045 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qcck4" event={"ID":"57a4752c-c640-4ea1-9044-a70c59b85c50","Type":"ContainerStarted","Data":"6fe224bc71d716f02d19e02a0d34c59755d48f97a13da3c8af44eb5869dc1b3f"} Oct 03 09:01:43 crc kubenswrapper[4810]: I1003 09:01:43.578001 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qcck4" podStartSLOduration=2.428567493 podStartE2EDuration="10.577977043s" podCreationTimestamp="2025-10-03 09:01:33 +0000 UTC" firstStartedPulling="2025-10-03 09:01:34.534443466 +0000 UTC m=+7527.961694201" lastFinishedPulling="2025-10-03 09:01:42.683853026 +0000 UTC m=+7536.111103751" observedRunningTime="2025-10-03 09:01:43.577306115 +0000 UTC m=+7537.004556850" watchObservedRunningTime="2025-10-03 09:01:43.577977043 +0000 UTC m=+7537.005227778" Oct 03 09:01:48 crc kubenswrapper[4810]: I1003 09:01:48.598073 4810 generic.go:334] "Generic (PLEG): container finished" podID="57a4752c-c640-4ea1-9044-a70c59b85c50" containerID="6fe224bc71d716f02d19e02a0d34c59755d48f97a13da3c8af44eb5869dc1b3f" exitCode=0 Oct 03 09:01:48 crc kubenswrapper[4810]: I1003 09:01:48.598221 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qcck4" event={"ID":"57a4752c-c640-4ea1-9044-a70c59b85c50","Type":"ContainerDied","Data":"6fe224bc71d716f02d19e02a0d34c59755d48f97a13da3c8af44eb5869dc1b3f"} Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.029498 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kgg9h"] Oct 03 09:01:49 crc 
kubenswrapper[4810]: I1003 09:01:49.036419 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kgg9h"] Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.319062 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149a19cb-62bc-4739-a90d-7366c2147b1d" path="/var/lib/kubelet/pods/149a19cb-62bc-4739-a90d-7366c2147b1d/volumes" Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.957165 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.978621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data\") pod \"57a4752c-c640-4ea1-9044-a70c59b85c50\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.978678 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsrz\" (UniqueName: \"kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz\") pod \"57a4752c-c640-4ea1-9044-a70c59b85c50\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.978843 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle\") pod \"57a4752c-c640-4ea1-9044-a70c59b85c50\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.978879 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts\") pod \"57a4752c-c640-4ea1-9044-a70c59b85c50\" (UID: \"57a4752c-c640-4ea1-9044-a70c59b85c50\") " Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.986016 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz" (OuterVolumeSpecName: "kube-api-access-6xsrz") pod "57a4752c-c640-4ea1-9044-a70c59b85c50" (UID: "57a4752c-c640-4ea1-9044-a70c59b85c50"). InnerVolumeSpecName "kube-api-access-6xsrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:49 crc kubenswrapper[4810]: I1003 09:01:49.989125 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts" (OuterVolumeSpecName: "scripts") pod "57a4752c-c640-4ea1-9044-a70c59b85c50" (UID: "57a4752c-c640-4ea1-9044-a70c59b85c50"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.011314 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data" (OuterVolumeSpecName: "config-data") pod "57a4752c-c640-4ea1-9044-a70c59b85c50" (UID: "57a4752c-c640-4ea1-9044-a70c59b85c50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.014307 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a4752c-c640-4ea1-9044-a70c59b85c50" (UID: "57a4752c-c640-4ea1-9044-a70c59b85c50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.080643 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.080682 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsrz\" (UniqueName: \"kubernetes.io/projected/57a4752c-c640-4ea1-9044-a70c59b85c50-kube-api-access-6xsrz\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.080693 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.080701 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57a4752c-c640-4ea1-9044-a70c59b85c50-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.619929 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qcck4" event={"ID":"57a4752c-c640-4ea1-9044-a70c59b85c50","Type":"ContainerDied","Data":"3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3"} Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.619973 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3972cd3789273c4f51638fec380013f7bcbbe0e2d37a00429a72b7566e8f01f3" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.620036 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qcck4" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.698264 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:01:50 crc kubenswrapper[4810]: E1003 09:01:50.698668 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a4752c-c640-4ea1-9044-a70c59b85c50" containerName="nova-cell0-conductor-db-sync" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.698685 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a4752c-c640-4ea1-9044-a70c59b85c50" containerName="nova-cell0-conductor-db-sync" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.698858 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a4752c-c640-4ea1-9044-a70c59b85c50" containerName="nova-cell0-conductor-db-sync" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.699481 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.704939 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrjql" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.706098 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.710032 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.792831 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.793284 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.793319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ss5j\" (UniqueName: \"kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.894833 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.895428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.895558 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ss5j\" (UniqueName: \"kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.900047 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.901983 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:50 crc kubenswrapper[4810]: I1003 09:01:50.917656 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ss5j\" (UniqueName: \"kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j\") pod \"nova-cell0-conductor-0\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:51 crc kubenswrapper[4810]: I1003 09:01:51.017872 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:51 crc kubenswrapper[4810]: I1003 09:01:51.462576 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:01:51 crc kubenswrapper[4810]: I1003 09:01:51.634273 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72f95e65-4a0d-4f33-b977-abca4f20aeef","Type":"ContainerStarted","Data":"343a8f5e3e000ebfa8bc8688340b12ec57a2289f81805634ddaa24931011bdb5"} Oct 03 09:01:52 crc kubenswrapper[4810]: I1003 09:01:52.657796 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72f95e65-4a0d-4f33-b977-abca4f20aeef","Type":"ContainerStarted","Data":"3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4"} Oct 03 09:01:52 crc kubenswrapper[4810]: I1003 09:01:52.658207 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:52 crc kubenswrapper[4810]: I1003 09:01:52.683602 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.683584887 podStartE2EDuration="2.683584887s" podCreationTimestamp="2025-10-03 09:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:52.676062755 +0000 UTC m=+7546.103313510" watchObservedRunningTime="2025-10-03 09:01:52.683584887 +0000 UTC m=+7546.110835622" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.055331 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.492375 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pgz6g"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.496991 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.499171 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.499471 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.508889 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pgz6g"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.521787 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.521977 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpx5t\" (UniqueName: \"kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.522078 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.522580 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.624487 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.624850 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.624972 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpx5t\" (UniqueName: \"kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.625088 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.633667 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.636473 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.648136 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.650737 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.655850 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.659241 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.665435 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpx5t\" (UniqueName: \"kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t\") pod \"nova-cell0-cell-mapping-pgz6g\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.674447 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.701018 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.703055 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.713724 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726775 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt6n4\" (UniqueName: \"kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726847 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726906 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726936 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726967 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.726986 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ntp\" (UniqueName: \"kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.727009 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.742936 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.752022 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.753756 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.756590 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.762122 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.830914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt6n4\" (UniqueName: \"kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.830954 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831002 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831021 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831054 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831082 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831100 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831126 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831144 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ntp\" (UniqueName: 
\"kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831163 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.831193 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xkv\" (UniqueName: \"kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.832750 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.835375 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.837455 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.837697 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.849727 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.851457 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.859351 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.862663 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.872502 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt6n4\" (UniqueName: \"kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4\") pod \"nova-metadata-0\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " pod="openstack/nova-metadata-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.872594 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.896809 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.898344 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.907608 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ntp\" (UniqueName: \"kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp\") pod \"nova-cell1-novncproxy-0\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.914061 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.928052 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.933222 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.933289 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.933360 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xkv\" (UniqueName: \"kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: 
I1003 09:01:56.933440 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.933837 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.933936 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934408 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934439 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934514 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934576 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vr7p\" (UniqueName: \"kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934610 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8g9j\" (UniqueName: \"kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.934681 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.939539 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.949095 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.955820 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xkv\" (UniqueName: \"kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv\") pod \"nova-api-0\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " pod="openstack/nova-api-0" Oct 03 09:01:56 crc kubenswrapper[4810]: I1003 09:01:56.965325 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036318 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8g9j\" (UniqueName: \"kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036379 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vr7p\" (UniqueName: \"kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036488 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036628 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036692 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036756 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.036799 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.038152 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.038504 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.039580 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.042399 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.043220 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.050512 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.062912 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8g9j\" (UniqueName: \"kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j\") pod \"nova-scheduler-0\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.063238 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vr7p\" (UniqueName: \"kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p\") pod \"dnsmasq-dns-67f978bc79-wq9xp\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.111444 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.130781 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.278683 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.293881 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.334523 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.413026 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:57 crc kubenswrapper[4810]: W1003 09:01:57.439242 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce94c5f8_8c8a_4081_9c35_1efaad7cb4fd.slice/crio-00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033 WatchSource:0}: Error finding container 00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033: Status 404 returned error can't find the container with id 00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033 Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.468239 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pgz6g"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.531884 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xht9"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.533072 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.542478 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.543702 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.547601 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xht9"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.654329 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.654416 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.654455 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.654635 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6zhz\" (UniqueName: \"kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.759917 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.760106 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.760290 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6zhz\" (UniqueName: \"kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.760330 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: 
\"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.760369 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.768193 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.769154 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerStarted","Data":"1fef4720445ca3b684e6e5c5ca0b527ab309f85d518bd1933ac2baa9698a3b4a"} Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.769402 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.772016 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pgz6g" event={"ID":"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9","Type":"ContainerStarted","Data":"dbecd6cb4776c31357e1271ab6ac3c843e982451b993b84ecf53c0a4ebaabc01"} Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.772052 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pgz6g" event={"ID":"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9","Type":"ContainerStarted","Data":"318daca35a877b62bfc6ce61b4f90d2c4e1146b30dc2a6ad6848cad2ed18fc09"} Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.774255 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd","Type":"ContainerStarted","Data":"00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033"} Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.781423 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.793128 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6zhz\" (UniqueName: \"kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz\") pod \"nova-cell1-conductor-db-sync-9xht9\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.814369 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pgz6g" podStartSLOduration=1.814344568 podStartE2EDuration="1.814344568s" 
podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:57.799767599 +0000 UTC m=+7551.227018354" watchObservedRunningTime="2025-10-03 09:01:57.814344568 +0000 UTC m=+7551.241595313" Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.854984 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.861802 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:01:57 crc kubenswrapper[4810]: I1003 09:01:57.868380 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:01:57 crc kubenswrapper[4810]: W1003 09:01:57.868532 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod627c0a25_d0c9_4b90_beeb_de5398d1fb7c.slice/crio-ac2b13d2de655fad339b80849e016fdb119532f508221bbcb0054deeed91bdcd WatchSource:0}: Error finding container ac2b13d2de655fad339b80849e016fdb119532f508221bbcb0054deeed91bdcd: Status 404 returned error can't find the container with id ac2b13d2de655fad339b80849e016fdb119532f508221bbcb0054deeed91bdcd Oct 03 09:01:57 crc kubenswrapper[4810]: W1003 09:01:57.870187 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb80b5eb_1cc6_485c_a9fe_a5a5821b4b62.slice/crio-9bf7075e332d831606a48bae71f04fa6b95e31613d1398026e5cad73e3bc9dff WatchSource:0}: Error finding container 9bf7075e332d831606a48bae71f04fa6b95e31613d1398026e5cad73e3bc9dff: Status 404 returned error can't find the container with id 9bf7075e332d831606a48bae71f04fa6b95e31613d1398026e5cad73e3bc9dff Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.382248 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xht9"] Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.816319 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xht9" event={"ID":"41ba9c65-630f-4a79-9233-ad8cecbcbf40","Type":"ContainerStarted","Data":"b13a01ff6ae65e74349efa452f2b733e4245ac32f870e81bf70bb924cdfbec41"} Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.827039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerStarted","Data":"22528f90f86a294c560b1269a7a12eb422d61ea24a474eceeede497b7732e574"} Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.829935 4810 generic.go:334] "Generic (PLEG): container finished" podID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerID="e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3" exitCode=0 Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.830016 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" event={"ID":"627c0a25-d0c9-4b90-beeb-de5398d1fb7c","Type":"ContainerDied","Data":"e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3"} Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.830044 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" 
event={"ID":"627c0a25-d0c9-4b90-beeb-de5398d1fb7c","Type":"ContainerStarted","Data":"ac2b13d2de655fad339b80849e016fdb119532f508221bbcb0054deeed91bdcd"} Oct 03 09:01:58 crc kubenswrapper[4810]: I1003 09:01:58.838486 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62","Type":"ContainerStarted","Data":"9bf7075e332d831606a48bae71f04fa6b95e31613d1398026e5cad73e3bc9dff"} Oct 03 09:01:59 crc kubenswrapper[4810]: I1003 09:01:59.855139 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xht9" event={"ID":"41ba9c65-630f-4a79-9233-ad8cecbcbf40","Type":"ContainerStarted","Data":"756e3fa73811a7d6d4b37ad57ce515d218bc088c49d197f258e0fc1126478923"} Oct 03 09:01:59 crc kubenswrapper[4810]: I1003 09:01:59.874405 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9xht9" podStartSLOduration=2.874380554 podStartE2EDuration="2.874380554s" podCreationTimestamp="2025-10-03 09:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:59.869674378 +0000 UTC m=+7553.296925123" watchObservedRunningTime="2025-10-03 09:01:59.874380554 +0000 UTC m=+7553.301631290" Oct 03 09:02:01 crc kubenswrapper[4810]: I1003 09:02:01.456503 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:01 crc kubenswrapper[4810]: I1003 09:02:01.474749 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:02 crc kubenswrapper[4810]: I1003 09:02:02.088338 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:02:02 crc kubenswrapper[4810]: I1003 09:02:02.088407 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.904319 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerStarted","Data":"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.905169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerStarted","Data":"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.904521 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-metadata" containerID="cri-o://b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" gracePeriod=30 Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.904444 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-log" containerID="cri-o://0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" gracePeriod=30 Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.907419 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" event={"ID":"627c0a25-d0c9-4b90-beeb-de5398d1fb7c","Type":"ContainerStarted","Data":"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.907549 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.911509 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd","Type":"ContainerStarted","Data":"f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.911562 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735" gracePeriod=30 Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.914633 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62","Type":"ContainerStarted","Data":"e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.917502 4810 generic.go:334] "Generic (PLEG): container finished" podID="41ba9c65-630f-4a79-9233-ad8cecbcbf40" containerID="756e3fa73811a7d6d4b37ad57ce515d218bc088c49d197f258e0fc1126478923" exitCode=0 Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.917533 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xht9" event={"ID":"41ba9c65-630f-4a79-9233-ad8cecbcbf40","Type":"ContainerDied","Data":"756e3fa73811a7d6d4b37ad57ce515d218bc088c49d197f258e0fc1126478923"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.920713 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerStarted","Data":"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.920762 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerStarted","Data":"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.923372 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" containerID="dbecd6cb4776c31357e1271ab6ac3c843e982451b993b84ecf53c0a4ebaabc01" exitCode=0 Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.923412 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pgz6g" event={"ID":"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9","Type":"ContainerDied","Data":"dbecd6cb4776c31357e1271ab6ac3c843e982451b993b84ecf53c0a4ebaabc01"} Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.930516 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.721366552 podStartE2EDuration="8.930496452s" podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="2025-10-03 09:01:57.781273806 +0000 UTC m=+7551.208524541" lastFinishedPulling="2025-10-03 09:02:03.990403696 +0000 UTC m=+7557.417654441" observedRunningTime="2025-10-03 09:02:04.928479418 +0000 UTC m=+7558.355730163" watchObservedRunningTime="2025-10-03 09:02:04.930496452 +0000 UTC m=+7558.357747187" Oct 03 09:02:04 crc kubenswrapper[4810]: I1003 09:02:04.978472 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" podStartSLOduration=8.978440013 podStartE2EDuration="8.978440013s" podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:04.965394635 +0000 UTC m=+7558.392645370" watchObservedRunningTime="2025-10-03 09:02:04.978440013 +0000 UTC m=+7558.405690768" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.012554 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.380571216 podStartE2EDuration="9.012519773s" podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="2025-10-03 09:01:57.358431539 +0000 UTC m=+7550.785682274" lastFinishedPulling="2025-10-03 09:02:03.990380096 +0000 UTC m=+7557.417630831" observedRunningTime="2025-10-03 09:02:04.997963715 +0000 UTC m=+7558.425214470" watchObservedRunningTime="2025-10-03 09:02:05.012519773 +0000 UTC m=+7558.439770518" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.028489 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.522484038 podStartE2EDuration="9.02846515s" podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="2025-10-03 09:01:57.45879657 +0000 UTC m=+7550.886047295" lastFinishedPulling="2025-10-03 09:02:03.964777672 +0000 UTC m=+7557.392028407" observedRunningTime="2025-10-03 09:02:05.018621297 +0000 UTC m=+7558.445872032" watchObservedRunningTime="2025-10-03 09:02:05.02846515 +0000 UTC m=+7558.455715885" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.511687 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.533643 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.45165626 podStartE2EDuration="9.533619185s" podCreationTimestamp="2025-10-03 09:01:56 +0000 UTC" firstStartedPulling="2025-10-03 09:01:57.882678294 +0000 UTC m=+7551.309929029" lastFinishedPulling="2025-10-03 09:02:03.964641229 +0000 UTC m=+7557.391891954" observedRunningTime="2025-10-03 09:02:05.05543549 +0000 UTC m=+7558.482686225" watchObservedRunningTime="2025-10-03 09:02:05.533619185 +0000 UTC m=+7558.960869920" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.632608 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs\") pod \"c9e67ba0-c853-4ba4-b58e-58209713890d\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.632734 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle\") pod \"c9e67ba0-c853-4ba4-b58e-58209713890d\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.632829 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data\") pod \"c9e67ba0-c853-4ba4-b58e-58209713890d\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.632888 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt6n4\" (UniqueName: \"kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4\") pod \"c9e67ba0-c853-4ba4-b58e-58209713890d\" (UID: \"c9e67ba0-c853-4ba4-b58e-58209713890d\") " Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.633109 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs" (OuterVolumeSpecName: "logs") pod "c9e67ba0-c853-4ba4-b58e-58209713890d" (UID: "c9e67ba0-c853-4ba4-b58e-58209713890d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.633673 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9e67ba0-c853-4ba4-b58e-58209713890d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.637675 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4" (OuterVolumeSpecName: "kube-api-access-nt6n4") pod "c9e67ba0-c853-4ba4-b58e-58209713890d" (UID: "c9e67ba0-c853-4ba4-b58e-58209713890d"). InnerVolumeSpecName "kube-api-access-nt6n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.657529 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data" (OuterVolumeSpecName: "config-data") pod "c9e67ba0-c853-4ba4-b58e-58209713890d" (UID: "c9e67ba0-c853-4ba4-b58e-58209713890d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.677789 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e67ba0-c853-4ba4-b58e-58209713890d" (UID: "c9e67ba0-c853-4ba4-b58e-58209713890d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.735832 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.735886 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt6n4\" (UniqueName: \"kubernetes.io/projected/c9e67ba0-c853-4ba4-b58e-58209713890d-kube-api-access-nt6n4\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.735925 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e67ba0-c853-4ba4-b58e-58209713890d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.934996 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerID="b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" exitCode=0 Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935038 4810 generic.go:334] "Generic (PLEG): container finished" podID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerID="0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" exitCode=143 Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935118 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935196 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerDied","Data":"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf"} Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935244 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerDied","Data":"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb"} Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935267 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c9e67ba0-c853-4ba4-b58e-58209713890d","Type":"ContainerDied","Data":"22528f90f86a294c560b1269a7a12eb422d61ea24a474eceeede497b7732e574"} Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.935295 4810 scope.go:117] "RemoveContainer" containerID="b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" Oct 03 09:02:05 crc kubenswrapper[4810]: I1003 09:02:05.974490 4810 scope.go:117] "RemoveContainer" containerID="0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.013592 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.033743 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.056425 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:06 crc kubenswrapper[4810]: E1003 09:02:06.057550 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-log" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.057572 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-log" Oct 03 09:02:06 crc kubenswrapper[4810]: E1003 09:02:06.057591 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-metadata" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.057598 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-metadata" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.057795 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-log" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.057821 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" containerName="nova-metadata-metadata" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.059042 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.066879 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.068870 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.069939 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.097343 4810 scope.go:117] "RemoveContainer" containerID="b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" Oct 03 09:02:06 crc kubenswrapper[4810]: E1003 09:02:06.101533 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf\": container with ID starting with b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf not found: ID does not exist" containerID="b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.101586 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf"} err="failed to get container status \"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf\": rpc error: code = NotFound desc = could not find container \"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf\": container with ID starting with b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf not found: ID does not exist" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.101625 4810 scope.go:117] "RemoveContainer" containerID="0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" Oct 03 09:02:06 crc kubenswrapper[4810]: E1003 09:02:06.106427 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb\": container with ID starting with 0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb not found: ID does not exist" containerID="0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.106479 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb"} err="failed to get container status \"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb\": rpc error: code = NotFound desc = could not find container \"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb\": container with ID starting with 0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb not found: ID does not exist" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.106514 4810 scope.go:117] "RemoveContainer" containerID="b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.113458 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf"} err="failed to get container status \"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf\": rpc error: 
code = NotFound desc = could not find container \"b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf\": container with ID starting with b1c04851515750f08434563d88482836c6cfa9375e1310b20db44ee24b6d1aaf not found: ID does not exist" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.113502 4810 scope.go:117] "RemoveContainer" containerID="0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.119540 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb"} err="failed to get container status \"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb\": rpc error: code = NotFound desc = could not find container \"0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb\": container with ID starting with 0983f6170b8b422ade037db94b23c5cfdd7feba692b5c3e5ea217c1038fb3ccb not found: ID does not exist" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.150006 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cl7h\" (UniqueName: \"kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.150077 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.150157 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.150201 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.150226 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.251661 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.251794 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cl7h\" (UniqueName: \"kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h\") pod 
\"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.251837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.251942 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.251966 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.254856 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.256444 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.258947 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.261169 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.276148 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cl7h\" (UniqueName: \"kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h\") pod \"nova-metadata-0\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.357088 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.392690 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.464802 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts\") pod \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.464874 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle\") pod \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.464960 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data\") pod \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.465548 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpx5t\" (UniqueName: \"kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t\") pod \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\" (UID: \"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.471071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts" (OuterVolumeSpecName: "scripts") pod "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" (UID: "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.473631 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t" (OuterVolumeSpecName: "kube-api-access-lpx5t") pod "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" (UID: "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9"). InnerVolumeSpecName "kube-api-access-lpx5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.495057 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" (UID: "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.534325 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data" (OuterVolumeSpecName: "config-data") pod "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" (UID: "1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.541814 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.579181 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpx5t\" (UniqueName: \"kubernetes.io/projected/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-kube-api-access-lpx5t\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.579247 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.579260 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.579275 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.681925 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6zhz\" (UniqueName: \"kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz\") pod \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.682107 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts\") pod \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.682147 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data\") pod \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.682247 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") pod \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.692851 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts" (OuterVolumeSpecName: "scripts") pod "41ba9c65-630f-4a79-9233-ad8cecbcbf40" (UID: "41ba9c65-630f-4a79-9233-ad8cecbcbf40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.717498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz" (OuterVolumeSpecName: "kube-api-access-g6zhz") pod "41ba9c65-630f-4a79-9233-ad8cecbcbf40" (UID: "41ba9c65-630f-4a79-9233-ad8cecbcbf40"). InnerVolumeSpecName "kube-api-access-g6zhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: E1003 09:02:06.735883 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle podName:41ba9c65-630f-4a79-9233-ad8cecbcbf40 nodeName:}" failed. No retries permitted until 2025-10-03 09:02:07.235834733 +0000 UTC m=+7560.663085468 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle") pod "41ba9c65-630f-4a79-9233-ad8cecbcbf40" (UID: "41ba9c65-630f-4a79-9233-ad8cecbcbf40") : error deleting /var/lib/kubelet/pods/41ba9c65-630f-4a79-9233-ad8cecbcbf40/volume-subpaths: remove /var/lib/kubelet/pods/41ba9c65-630f-4a79-9233-ad8cecbcbf40/volume-subpaths: no such file or directory Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.739576 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data" (OuterVolumeSpecName: "config-data") pod "41ba9c65-630f-4a79-9233-ad8cecbcbf40" (UID: "41ba9c65-630f-4a79-9233-ad8cecbcbf40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.784534 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6zhz\" (UniqueName: \"kubernetes.io/projected/41ba9c65-630f-4a79-9233-ad8cecbcbf40-kube-api-access-g6zhz\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.784574 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.784584 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.946197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pgz6g" event={"ID":"1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9","Type":"ContainerDied","Data":"318daca35a877b62bfc6ce61b4f90d2c4e1146b30dc2a6ad6848cad2ed18fc09"} Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.946252 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="318daca35a877b62bfc6ce61b4f90d2c4e1146b30dc2a6ad6848cad2ed18fc09" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.946306 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pgz6g" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.953298 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xht9" event={"ID":"41ba9c65-630f-4a79-9233-ad8cecbcbf40","Type":"ContainerDied","Data":"b13a01ff6ae65e74349efa452f2b733e4245ac32f870e81bf70bb924cdfbec41"} Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.953330 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b13a01ff6ae65e74349efa452f2b733e4245ac32f870e81bf70bb924cdfbec41" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.953394 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xht9" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.966045 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.966113 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:06 crc kubenswrapper[4810]: I1003 09:02:06.989797 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.068782 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: E1003 09:02:07.069642 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ba9c65-630f-4a79-9233-ad8cecbcbf40" containerName="nova-cell1-conductor-db-sync" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.069672 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ba9c65-630f-4a79-9233-ad8cecbcbf40" containerName="nova-cell1-conductor-db-sync" Oct 03 09:02:07 crc kubenswrapper[4810]: E1003 09:02:07.069708 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" containerName="nova-manage" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.069718 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" containerName="nova-manage" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.070005 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" containerName="nova-manage" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.070039 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ba9c65-630f-4a79-9233-ad8cecbcbf40" containerName="nova-cell1-conductor-db-sync" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.073955 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.079597 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.114987 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.200175 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.208289 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.208464 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.208537 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmgq\" (UniqueName: \"kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.209350 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.209742 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" containerName="nova-scheduler-scheduler" containerID="cri-o://e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.260474 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.279073 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.309830 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") pod \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\" (UID: \"41ba9c65-630f-4a79-9233-ad8cecbcbf40\") " Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.310245 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.310364 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle\") 
pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.310398 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmgq\" (UniqueName: \"kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.315034 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.316375 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.324378 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ba9c65-630f-4a79-9233-ad8cecbcbf40" (UID: "41ba9c65-630f-4a79-9233-ad8cecbcbf40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.333340 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmgq\" (UniqueName: \"kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq\") pod \"nova-cell1-conductor-0\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.337324 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e67ba0-c853-4ba4-b58e-58209713890d" path="/var/lib/kubelet/pods/c9e67ba0-c853-4ba4-b58e-58209713890d/volumes" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.423583 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ba9c65-630f-4a79-9233-ad8cecbcbf40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.436600 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.937510 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.969229 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef","Type":"ContainerStarted","Data":"afbb2bf3aba0242f71f063b43970d8004c452d0196ad73904a67696ed10f99b8"} Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.979850 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-log" containerID="cri-o://9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980319 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-log" containerID="cri-o://bd3d09032be293ee570439dc544a171fa80769c7a04e80db8ddeac767dc95a48" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980380 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerStarted","Data":"8c36e2e8cd4f021ac4fec75baa0abcd4e71eebb83a3be8eed45f91852ac6d3b0"} Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980406 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerStarted","Data":"bd3d09032be293ee570439dc544a171fa80769c7a04e80db8ddeac767dc95a48"} Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980418 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerStarted","Data":"aebcd41e195e56e26a24b66279c6208e0adb2b176ac325f421ef56f0de6f51da"} Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980712 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-api" containerID="cri-o://7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.980790 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-metadata" containerID="cri-o://8c36e2e8cd4f021ac4fec75baa0abcd4e71eebb83a3be8eed45f91852ac6d3b0" gracePeriod=30 Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.990627 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.98:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:07 crc kubenswrapper[4810]: I1003 09:02:07.990674 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.98:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 
03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.004427 4810 generic.go:334] "Generic (PLEG): container finished" podID="12001d31-156d-4f6e-a242-a53ccacd1473" containerID="9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63" exitCode=143 Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.004641 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerDied","Data":"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63"} Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.008192 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef","Type":"ContainerStarted","Data":"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787"} Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.008402 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.013193 4810 generic.go:334] "Generic (PLEG): container finished" podID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerID="bd3d09032be293ee570439dc544a171fa80769c7a04e80db8ddeac767dc95a48" exitCode=143 Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.013293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerDied","Data":"bd3d09032be293ee570439dc544a171fa80769c7a04e80db8ddeac767dc95a48"} Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.028006 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.027983109 podStartE2EDuration="2.027983109s" podCreationTimestamp="2025-10-03 09:02:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:09.026532061 +0000 UTC m=+7562.453782796" watchObservedRunningTime="2025-10-03 09:02:09.027983109 +0000 UTC m=+7562.455233844" Oct 03 09:02:09 crc kubenswrapper[4810]: I1003 09:02:09.036880 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.036856447 podStartE2EDuration="3.036856447s" podCreationTimestamp="2025-10-03 09:02:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:08.023795602 +0000 UTC m=+7561.451046337" watchObservedRunningTime="2025-10-03 09:02:09.036856447 +0000 UTC m=+7562.464107182" Oct 03 09:02:11 crc kubenswrapper[4810]: I1003 09:02:11.393364 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:11 crc kubenswrapper[4810]: I1003 09:02:11.394002 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:12 crc kubenswrapper[4810]: I1003 09:02:12.296768 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:02:12 crc kubenswrapper[4810]: I1003 09:02:12.376101 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:02:12 crc kubenswrapper[4810]: I1003 09:02:12.376435 4810 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="dnsmasq-dns" containerID="cri-o://5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe" gracePeriod=10 Oct 03 09:02:12 crc kubenswrapper[4810]: I1003 09:02:12.537642 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.82:5353: connect: connection refused" Oct 03 09:02:12 crc kubenswrapper[4810]: I1003 09:02:12.903631 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.051227 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdbz\" (UniqueName: \"kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz\") pod \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.052709 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config\") pod \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.052839 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb\") pod \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.053169 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc\") pod \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.053197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb\") pod \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\" (UID: \"8357f38e-0c4b-40d3-9c8c-06a5b127d871\") " Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.058683 4810 generic.go:334] "Generic (PLEG): container finished" podID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerID="5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe" exitCode=0 Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.058741 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" event={"ID":"8357f38e-0c4b-40d3-9c8c-06a5b127d871","Type":"ContainerDied","Data":"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe"} Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.058774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" event={"ID":"8357f38e-0c4b-40d3-9c8c-06a5b127d871","Type":"ContainerDied","Data":"1685d82be7357912e0edae2c4754023c5d69c5232f5258f5b95506e786018740"} Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.058801 4810 scope.go:117] "RemoveContainer" containerID="5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe" Oct 
03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.059009 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz" (OuterVolumeSpecName: "kube-api-access-tmdbz") pod "8357f38e-0c4b-40d3-9c8c-06a5b127d871" (UID: "8357f38e-0c4b-40d3-9c8c-06a5b127d871"). InnerVolumeSpecName "kube-api-access-tmdbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.059022 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-795b7f6797-7wt5c" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.105737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8357f38e-0c4b-40d3-9c8c-06a5b127d871" (UID: "8357f38e-0c4b-40d3-9c8c-06a5b127d871"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.122201 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config" (OuterVolumeSpecName: "config") pod "8357f38e-0c4b-40d3-9c8c-06a5b127d871" (UID: "8357f38e-0c4b-40d3-9c8c-06a5b127d871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.129126 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8357f38e-0c4b-40d3-9c8c-06a5b127d871" (UID: "8357f38e-0c4b-40d3-9c8c-06a5b127d871"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.135224 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8357f38e-0c4b-40d3-9c8c-06a5b127d871" (UID: "8357f38e-0c4b-40d3-9c8c-06a5b127d871"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.156354 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmdbz\" (UniqueName: \"kubernetes.io/projected/8357f38e-0c4b-40d3-9c8c-06a5b127d871-kube-api-access-tmdbz\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.156416 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.156433 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.156448 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.156468 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8357f38e-0c4b-40d3-9c8c-06a5b127d871-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.182319 4810 scope.go:117] "RemoveContainer" containerID="96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.216167 4810 scope.go:117] "RemoveContainer" containerID="5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe" Oct 03 09:02:13 crc kubenswrapper[4810]: E1003 09:02:13.216763 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe\": container with ID starting with 5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe not found: ID does not exist" containerID="5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.216804 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe"} err="failed to get container status \"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe\": rpc error: code = NotFound desc = could not find container \"5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe\": container with ID starting with 5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe not found: ID does not exist" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.216836 4810 scope.go:117] "RemoveContainer" containerID="96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd" Oct 03 09:02:13 crc kubenswrapper[4810]: E1003 09:02:13.217304 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd\": container with ID starting with 96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd not found: ID does not exist" containerID="96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.217345 4810 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd"} err="failed to get container status \"96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd\": rpc error: code = NotFound desc = could not find container \"96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd\": container with ID starting with 96eb085c0035d851e5258a6b5dfa5223bf22827c33b969e4d0f5ce3b28e197cd not found: ID does not exist" Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.387985 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:02:13 crc kubenswrapper[4810]: I1003 09:02:13.399097 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-795b7f6797-7wt5c"] Oct 03 09:02:15 crc kubenswrapper[4810]: I1003 09:02:15.315828 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" path="/var/lib/kubelet/pods/8357f38e-0c4b-40d3-9c8c-06a5b127d871/volumes" Oct 03 09:02:16 crc kubenswrapper[4810]: I1003 09:02:16.951225 4810 scope.go:117] "RemoveContainer" containerID="456df9582fd0dabe43d26d0d1c99eaad082ba03a1a3d0ee1753963d93927930c" Oct 03 09:02:17 crc kubenswrapper[4810]: I1003 09:02:17.015750 4810 scope.go:117] "RemoveContainer" containerID="61f214431d6eb5882a71d4fd19c306f5db9cc8561556c04737bf1d37c0c93c80" Oct 03 09:02:17 crc kubenswrapper[4810]: I1003 09:02:17.067472 4810 scope.go:117] "RemoveContainer" containerID="6ad3d0a696c4ff3e86eadaee3703451b858cce5adeab38849fbc64554d7ae74f" Oct 03 09:02:17 crc kubenswrapper[4810]: I1003 09:02:17.464972 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 09:02:18 crc kubenswrapper[4810]: E1003 09:02:18.126292 4810 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/fff2e68b4bc936a17189405bf001e4366b684f6df184f4bad3ffe549c051de82/diff" to get inode usage: stat /var/lib/containers/storage/overlay/fff2e68b4bc936a17189405bf001e4366b684f6df184f4bad3ffe549c051de82/diff: no such file or directory, extraDiskErr: Oct 03 09:02:18 crc kubenswrapper[4810]: E1003 09:02:18.673460 4810 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/b15705d14e72d1d918843a3b785acf4e6a75b5aaef6cb8a0c5fa502207271493/diff" to get inode usage: stat /var/lib/containers/storage/overlay/b15705d14e72d1d918843a3b785acf4e6a75b5aaef6cb8a0c5fa502207271493/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-795b7f6797-7wt5c_8357f38e-0c4b-40d3-9c8c-06a5b127d871/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-795b7f6797-7wt5c_8357f38e-0c4b-40d3-9c8c-06a5b127d871/dnsmasq-dns/0.log: no such file or directory Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.742032 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.839395 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9xkv\" (UniqueName: \"kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv\") pod \"12001d31-156d-4f6e-a242-a53ccacd1473\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.839840 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data\") pod \"12001d31-156d-4f6e-a242-a53ccacd1473\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.839939 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs\") pod \"12001d31-156d-4f6e-a242-a53ccacd1473\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.840106 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle\") pod \"12001d31-156d-4f6e-a242-a53ccacd1473\" (UID: \"12001d31-156d-4f6e-a242-a53ccacd1473\") " Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.840986 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs" (OuterVolumeSpecName: "logs") pod "12001d31-156d-4f6e-a242-a53ccacd1473" (UID: "12001d31-156d-4f6e-a242-a53ccacd1473"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.847177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv" (OuterVolumeSpecName: "kube-api-access-f9xkv") pod "12001d31-156d-4f6e-a242-a53ccacd1473" (UID: "12001d31-156d-4f6e-a242-a53ccacd1473"). InnerVolumeSpecName "kube-api-access-f9xkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.870092 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data" (OuterVolumeSpecName: "config-data") pod "12001d31-156d-4f6e-a242-a53ccacd1473" (UID: "12001d31-156d-4f6e-a242-a53ccacd1473"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.873281 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12001d31-156d-4f6e-a242-a53ccacd1473" (UID: "12001d31-156d-4f6e-a242-a53ccacd1473"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.942991 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9xkv\" (UniqueName: \"kubernetes.io/projected/12001d31-156d-4f6e-a242-a53ccacd1473-kube-api-access-f9xkv\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.943048 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.943060 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12001d31-156d-4f6e-a242-a53ccacd1473-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:21 crc kubenswrapper[4810]: I1003 09:02:21.943071 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12001d31-156d-4f6e-a242-a53ccacd1473-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.148634 4810 generic.go:334] "Generic (PLEG): container finished" podID="12001d31-156d-4f6e-a242-a53ccacd1473" containerID="7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5" exitCode=0 Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.148682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerDied","Data":"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5"} Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.148717 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"12001d31-156d-4f6e-a242-a53ccacd1473","Type":"ContainerDied","Data":"1fef4720445ca3b684e6e5c5ca0b527ab309f85d518bd1933ac2baa9698a3b4a"} Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.148713 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.148735 4810 scope.go:117] "RemoveContainer" containerID="7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.184827 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.196856 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.198428 4810 scope.go:117] "RemoveContainer" containerID="9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.209963 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.210535 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-log" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210560 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-log" Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.210590 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="init" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210598 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="init" Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.210612 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-api" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210620 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-api" Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.210636 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="dnsmasq-dns" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210644 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="dnsmasq-dns" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210865 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8357f38e-0c4b-40d3-9c8c-06a5b127d871" containerName="dnsmasq-dns" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210877 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-api" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.210915 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" containerName="nova-api-log" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.212259 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.214595 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.218664 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.229095 4810 scope.go:117] "RemoveContainer" containerID="7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5" Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.229582 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5\": container with ID starting with 7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5 not found: ID does not exist" containerID="7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.229621 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5"} err="failed to get container status \"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5\": rpc error: code = NotFound desc = could not find container \"7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5\": container with ID starting with 7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5 not found: ID does not exist" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.229645 4810 scope.go:117] "RemoveContainer" containerID="9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63" Oct 03 09:02:22 crc kubenswrapper[4810]: E1003 09:02:22.230076 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63\": container with ID starting with 9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63 not found: ID does not exist" containerID="9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.230104 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63"} err="failed to get container status \"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63\": rpc error: code = NotFound desc = could not find container \"9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63\": container with ID starting with 9baf69d5296e2283a655a21f7c44862969811f9092a4d681dd8f43c78dfcee63 not found: ID does not exist" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.349093 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.349186 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" 
Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.349256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vgc\" (UniqueName: \"kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.349323 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.451380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.451448 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.451512 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vgc\" (UniqueName: \"kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.451555 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.452401 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.456086 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.459073 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.471259 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vgc\" (UniqueName: \"kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc\") pod \"nova-api-0\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " pod="openstack/nova-api-0" Oct 03 09:02:22 crc 
kubenswrapper[4810]: I1003 09:02:22.543054 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:02:22 crc kubenswrapper[4810]: I1003 09:02:22.992049 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:23 crc kubenswrapper[4810]: I1003 09:02:23.159204 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerStarted","Data":"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b"} Oct 03 09:02:23 crc kubenswrapper[4810]: I1003 09:02:23.159469 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerStarted","Data":"5af67ea32b4ed280658f0c34799be88c1c4eaf7241cf80c847ac985bd3cf1b52"} Oct 03 09:02:23 crc kubenswrapper[4810]: I1003 09:02:23.314928 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12001d31-156d-4f6e-a242-a53ccacd1473" path="/var/lib/kubelet/pods/12001d31-156d-4f6e-a242-a53ccacd1473/volumes" Oct 03 09:02:24 crc kubenswrapper[4810]: I1003 09:02:24.169441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerStarted","Data":"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b"} Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.088342 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.088926 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.088970 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.089677 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.089725 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973" gracePeriod=600 Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.254351 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973" exitCode=0 Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.254412 4810 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973"} Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.254458 4810 scope.go:117] "RemoveContainer" containerID="0ac69b4234b1ddf097fcca5b6254005060c91ad5afcd6f8fd4573021871dfc99" Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.544127 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:32 crc kubenswrapper[4810]: I1003 09:02:32.544508 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:33 crc kubenswrapper[4810]: I1003 09:02:33.274073 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da"} Oct 03 09:02:33 crc kubenswrapper[4810]: I1003 09:02:33.301155 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=11.301110886 podStartE2EDuration="11.301110886s" podCreationTimestamp="2025-10-03 09:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:24.190643734 +0000 UTC m=+7577.617894469" watchObservedRunningTime="2025-10-03 09:02:33.301110886 +0000 UTC m=+7586.728361621" Oct 03 09:02:33 crc kubenswrapper[4810]: I1003 09:02:33.627144 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:33 crc kubenswrapper[4810]: I1003 09:02:33.627172 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.304179 4810 generic.go:334] "Generic (PLEG): container finished" podID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" containerID="f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735" exitCode=137 Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.315039 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd","Type":"ContainerDied","Data":"f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735"} Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.315103 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd","Type":"ContainerDied","Data":"00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033"} Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.315122 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033" Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.372126 4810 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.549187 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") pod \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.549730 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data\") pod \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.549842 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ntp\" (UniqueName: \"kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp\") pod \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.555308 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp" (OuterVolumeSpecName: "kube-api-access-48ntp") pod "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" (UID: "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd"). InnerVolumeSpecName "kube-api-access-48ntp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:35 crc kubenswrapper[4810]: E1003 09:02:35.572378 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle podName:ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd nodeName:}" failed. No retries permitted until 2025-10-03 09:02:36.072335443 +0000 UTC m=+7589.499586198 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle") pod "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" (UID: "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd") : error deleting /var/lib/kubelet/pods/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd/volume-subpaths: remove /var/lib/kubelet/pods/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd/volume-subpaths: no such file or directory Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.575118 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data" (OuterVolumeSpecName: "config-data") pod "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" (UID: "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.652706 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:35 crc kubenswrapper[4810]: I1003 09:02:35.652752 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ntp\" (UniqueName: \"kubernetes.io/projected/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-kube-api-access-48ntp\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.159156 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") pod \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\" (UID: \"ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd\") " Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.162448 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" (UID: "ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.261781 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.313051 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.351432 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.359697 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.390835 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:36 crc kubenswrapper[4810]: E1003 09:02:36.391371 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.391393 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.399037 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.400191 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.402742 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.405479 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.406111 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.409416 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.567228 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzrq\" (UniqueName: \"kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.567319 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.567387 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.567676 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.567793 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.669457 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.669512 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.669549 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzrq\" (UniqueName: \"kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.669611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.669669 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.673441 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.674048 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.675073 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.675162 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.688285 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzrq\" (UniqueName: \"kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq\") pod \"nova-cell1-novncproxy-0\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:36 crc kubenswrapper[4810]: I1003 09:02:36.718951 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.208053 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.324683 4810 generic.go:334] "Generic (PLEG): container finished" podID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" containerID="e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5" exitCode=137 Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.363844 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd" path="/var/lib/kubelet/pods/ce94c5f8-8c8a-4081-9c35-1efaad7cb4fd/volumes" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.364586 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62","Type":"ContainerDied","Data":"e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5"} Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.364620 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe","Type":"ContainerStarted","Data":"81434baf854e7a7e4e5ba310cc83eec8a78c224a00665c0f25b5b5ac10eee5b7"} Oct 03 09:02:37 crc kubenswrapper[4810]: E1003 09:02:37.525030 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8357f38e_0c4b_40d3_9c8c_06a5b127d871.slice/crio-conmon-5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ba9c65_630f_4a79_9233_ad8cecbcbf40.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12001d31_156d_4f6e_a242_a53ccacd1473.slice/crio-conmon-7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8357f38e_0c4b_40d3_9c8c_06a5b127d871.slice/crio-1685d82be7357912e0edae2c4754023c5d69c5232f5258f5b95506e786018740\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce94c5f8_8c8a_4081_9c35_1efaad7cb4fd.slice/crio-conmon-f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce94c5f8_8c8a_4081_9c35_1efaad7cb4fd.slice/crio-00c60fb15ba4fc833c87a1c3f46f4d5dbe50173484f92cbc276a587ea91ad033\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb80b5eb_1cc6_485c_a9fe_a5a5821b4b62.slice/crio-conmon-e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12001d31_156d_4f6e_a242_a53ccacd1473.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12d3cfb_2ba7_4eb6_b6b4_bfc4cec93930.slice/crio-conmon-36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8357f38e_0c4b_40d3_9c8c_06a5b127d871.slice/crio-5dfab78655c6f8900be8d173255f4776fb00f05262e8d96b28dc2b0a313eebbe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12001d31_156d_4f6e_a242_a53ccacd1473.slice/crio-7cb1f5ac84eb659a11d6e2474c239ead6554a85d8ef803fa23060f0efea41fc5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12001d31_156d_4f6e_a242_a53ccacd1473.slice/crio-1fef4720445ca3b684e6e5c5ca0b527ab309f85d518bd1933ac2baa9698a3b4a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce94c5f8_8c8a_4081_9c35_1efaad7cb4fd.slice/crio-f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8357f38e_0c4b_40d3_9c8c_06a5b127d871.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12d3cfb_2ba7_4eb6_b6b4_bfc4cec93930.slice/crio-36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973.scope\": RecentStats: unable to find data in memory cache]" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.646046 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.695229 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8g9j\" (UniqueName: \"kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j\") pod \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.695467 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data\") pod \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.695571 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle\") pod \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\" (UID: \"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62\") " Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.702500 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j" (OuterVolumeSpecName: "kube-api-access-b8g9j") pod "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" (UID: "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62"). InnerVolumeSpecName "kube-api-access-b8g9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.721526 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" (UID: "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.723870 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data" (OuterVolumeSpecName: "config-data") pod "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" (UID: "cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.797497 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.797837 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:37 crc kubenswrapper[4810]: I1003 09:02:37.798033 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8g9j\" (UniqueName: \"kubernetes.io/projected/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62-kube-api-access-b8g9j\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.336079 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62","Type":"ContainerDied","Data":"9bf7075e332d831606a48bae71f04fa6b95e31613d1398026e5cad73e3bc9dff"} Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.336110 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.336368 4810 scope.go:117] "RemoveContainer" containerID="e90111a837765bf201207171b2d85049ad565fdc69035e8644e9055cfce4ffc5" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.340698 4810 generic.go:334] "Generic (PLEG): container finished" podID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerID="8c36e2e8cd4f021ac4fec75baa0abcd4e71eebb83a3be8eed45f91852ac6d3b0" exitCode=137 Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.340806 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerDied","Data":"8c36e2e8cd4f021ac4fec75baa0abcd4e71eebb83a3be8eed45f91852ac6d3b0"} Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.340855 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"19cb56ec-a904-4d5c-8489-2396eadf70b2","Type":"ContainerDied","Data":"aebcd41e195e56e26a24b66279c6208e0adb2b176ac325f421ef56f0de6f51da"} Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.340872 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aebcd41e195e56e26a24b66279c6208e0adb2b176ac325f421ef56f0de6f51da" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.343449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe","Type":"ContainerStarted","Data":"ba6cb84bebecb7683aa43366962d518dd88cdaed82bb6fafec8f5209c8020422"} Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.368626 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.368607779 
podStartE2EDuration="2.368607779s" podCreationTimestamp="2025-10-03 09:02:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:38.368513156 +0000 UTC m=+7591.795763901" watchObservedRunningTime="2025-10-03 09:02:38.368607779 +0000 UTC m=+7591.795858534" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.375152 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.408182 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cl7h\" (UniqueName: \"kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h\") pod \"19cb56ec-a904-4d5c-8489-2396eadf70b2\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.408250 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle\") pod \"19cb56ec-a904-4d5c-8489-2396eadf70b2\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.408300 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs\") pod \"19cb56ec-a904-4d5c-8489-2396eadf70b2\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.408341 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data\") pod \"19cb56ec-a904-4d5c-8489-2396eadf70b2\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.408366 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs\") pod \"19cb56ec-a904-4d5c-8489-2396eadf70b2\" (UID: \"19cb56ec-a904-4d5c-8489-2396eadf70b2\") " Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.410587 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.410949 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs" (OuterVolumeSpecName: "logs") pod "19cb56ec-a904-4d5c-8489-2396eadf70b2" (UID: "19cb56ec-a904-4d5c-8489-2396eadf70b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.422845 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h" (OuterVolumeSpecName: "kube-api-access-5cl7h") pod "19cb56ec-a904-4d5c-8489-2396eadf70b2" (UID: "19cb56ec-a904-4d5c-8489-2396eadf70b2"). InnerVolumeSpecName "kube-api-access-5cl7h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.430082 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.455990 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cb56ec-a904-4d5c-8489-2396eadf70b2" (UID: "19cb56ec-a904-4d5c-8489-2396eadf70b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.456285 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data" (OuterVolumeSpecName: "config-data") pod "19cb56ec-a904-4d5c-8489-2396eadf70b2" (UID: "19cb56ec-a904-4d5c-8489-2396eadf70b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.463571 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:38 crc kubenswrapper[4810]: E1003 09:02:38.463978 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-log" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.463996 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-log" Oct 03 09:02:38 crc kubenswrapper[4810]: E1003 09:02:38.464019 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" containerName="nova-scheduler-scheduler" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464025 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" containerName="nova-scheduler-scheduler" Oct 03 09:02:38 crc kubenswrapper[4810]: E1003 09:02:38.464046 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-metadata" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464052 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-metadata" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464241 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" containerName="nova-scheduler-scheduler" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464270 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-log" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464278 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" containerName="nova-metadata-metadata" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.464919 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.467268 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.476259 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.479065 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "19cb56ec-a904-4d5c-8489-2396eadf70b2" (UID: "19cb56ec-a904-4d5c-8489-2396eadf70b2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511212 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmjxm\" (UniqueName: \"kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511466 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511537 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511551 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cl7h\" (UniqueName: \"kubernetes.io/projected/19cb56ec-a904-4d5c-8489-2396eadf70b2-kube-api-access-5cl7h\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511566 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511578 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19cb56ec-a904-4d5c-8489-2396eadf70b2-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.511588 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cb56ec-a904-4d5c-8489-2396eadf70b2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.613395 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmjxm\" (UniqueName: 
\"kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.613503 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.613552 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.617368 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.618740 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.633471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmjxm\" (UniqueName: \"kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm\") pod \"nova-scheduler-0\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " pod="openstack/nova-scheduler-0" Oct 03 09:02:38 crc kubenswrapper[4810]: I1003 09:02:38.825654 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.297975 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.325478 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62" path="/var/lib/kubelet/pods/cb80b5eb-1cc6-485c-a9fe-a5a5821b4b62/volumes" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.355836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bafbe49d-ab92-44bf-aba9-d72aab06e528","Type":"ContainerStarted","Data":"a491acd82300a62faaf90e8bd3e138e6afed0e9014e44ca0b47a5804b99f6260"} Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.355857 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.450841 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.470584 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.482710 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.484647 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.487075 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.487276 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.492259 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.546373 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbndn\" (UniqueName: \"kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.546475 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.546523 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.546555 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.546592 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.648836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.649149 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.649730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.649878 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbndn\" (UniqueName: \"kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.650025 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.650514 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.653507 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.653697 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.653914 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.666592 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbndn\" (UniqueName: \"kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn\") pod \"nova-metadata-0\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " pod="openstack/nova-metadata-0" Oct 03 09:02:39 crc kubenswrapper[4810]: I1003 09:02:39.811379 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:02:40 crc kubenswrapper[4810]: I1003 09:02:40.319436 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:40 crc kubenswrapper[4810]: I1003 09:02:40.367712 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerStarted","Data":"308d0c9c04dc3e6a409bda225f994c389f0ff7c8d418f007dbe35f533fcf3ec8"} Oct 03 09:02:40 crc kubenswrapper[4810]: I1003 09:02:40.371285 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bafbe49d-ab92-44bf-aba9-d72aab06e528","Type":"ContainerStarted","Data":"91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09"} Oct 03 09:02:40 crc kubenswrapper[4810]: I1003 09:02:40.401685 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.401663573 podStartE2EDuration="2.401663573s" podCreationTimestamp="2025-10-03 09:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:40.392606731 +0000 UTC m=+7593.819857466" watchObservedRunningTime="2025-10-03 09:02:40.401663573 +0000 UTC m=+7593.828914308" Oct 03 09:02:41 crc kubenswrapper[4810]: I1003 09:02:41.313601 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cb56ec-a904-4d5c-8489-2396eadf70b2" path="/var/lib/kubelet/pods/19cb56ec-a904-4d5c-8489-2396eadf70b2/volumes" Oct 03 09:02:41 crc kubenswrapper[4810]: I1003 09:02:41.384381 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerStarted","Data":"ec87bc4bd641aeabbf619cd1b5bca548318cb67371930db7ae78db54cb17b90b"} Oct 03 09:02:41 crc kubenswrapper[4810]: I1003 09:02:41.384455 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerStarted","Data":"a9076a758ff7aef766a584d973bb5be65b9283fd5090eea5f462ca0f42d6083f"} Oct 03 09:02:41 crc kubenswrapper[4810]: I1003 09:02:41.413604 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.413582557 podStartE2EDuration="2.413582557s" podCreationTimestamp="2025-10-03 09:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:41.411538412 +0000 UTC m=+7594.838789177" watchObservedRunningTime="2025-10-03 09:02:41.413582557 +0000 UTC m=+7594.840833292" Oct 03 09:02:41 crc kubenswrapper[4810]: I1003 09:02:41.719110 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:43 crc kubenswrapper[4810]: I1003 09:02:43.585197 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:43 crc kubenswrapper[4810]: I1003 09:02:43.626151 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" 
probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:43 crc kubenswrapper[4810]: I1003 09:02:43.826921 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:02:44 crc kubenswrapper[4810]: I1003 09:02:44.812292 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:44 crc kubenswrapper[4810]: I1003 09:02:44.812402 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:46 crc kubenswrapper[4810]: I1003 09:02:46.719997 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:46 crc kubenswrapper[4810]: I1003 09:02:46.739358 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.467560 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.629223 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-k6kp8"] Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.630497 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.636689 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.636968 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.655000 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-k6kp8"] Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.726082 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.726517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.726667 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtgj\" (UniqueName: \"kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.726712 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data\") pod 
\"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: E1003 09:02:47.766235 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ba9c65_630f_4a79_9233_ad8cecbcbf40.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.828403 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtgj\" (UniqueName: \"kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.828476 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.828550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.828613 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.834878 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.834966 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.836789 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.844706 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtgj\" (UniqueName: \"kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj\") pod \"nova-cell1-cell-mapping-k6kp8\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 
03 09:02:47 crc kubenswrapper[4810]: I1003 09:02:47.957941 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:48 crc kubenswrapper[4810]: I1003 09:02:48.450250 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-k6kp8"] Oct 03 09:02:48 crc kubenswrapper[4810]: I1003 09:02:48.462758 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-k6kp8" event={"ID":"39603956-d605-4c0c-a342-a18f22e572bd","Type":"ContainerStarted","Data":"15571a385650600c69bb054af11b7dd90c7b9a65a7f7e796253c742eaf4fa9b9"} Oct 03 09:02:48 crc kubenswrapper[4810]: I1003 09:02:48.826379 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 09:02:48 crc kubenswrapper[4810]: I1003 09:02:48.861732 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 09:02:49 crc kubenswrapper[4810]: I1003 09:02:49.472534 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-k6kp8" event={"ID":"39603956-d605-4c0c-a342-a18f22e572bd","Type":"ContainerStarted","Data":"ff15e78219b535ad72c83f852eb4e194818599bd9524f809a3c98675ed9023ed"} Oct 03 09:02:49 crc kubenswrapper[4810]: I1003 09:02:49.488550 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-k6kp8" podStartSLOduration=2.4885304059999998 podStartE2EDuration="2.488530406s" podCreationTimestamp="2025-10-03 09:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:49.485728991 +0000 UTC m=+7602.912979726" watchObservedRunningTime="2025-10-03 09:02:49.488530406 +0000 UTC m=+7602.915781141" Oct 03 09:02:49 crc kubenswrapper[4810]: I1003 09:02:49.502106 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 09:02:49 crc kubenswrapper[4810]: I1003 09:02:49.812694 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:02:49 crc kubenswrapper[4810]: I1003 09:02:49.813911 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:02:50 crc kubenswrapper[4810]: I1003 09:02:50.829084 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:50 crc kubenswrapper[4810]: I1003 09:02:50.829084 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.107:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:52 crc kubenswrapper[4810]: I1003 09:02:52.544734 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:02:52 crc kubenswrapper[4810]: I1003 09:02:52.546240 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:02:53 crc kubenswrapper[4810]: I1003 09:02:53.524989 4810 
generic.go:334] "Generic (PLEG): container finished" podID="39603956-d605-4c0c-a342-a18f22e572bd" containerID="ff15e78219b535ad72c83f852eb4e194818599bd9524f809a3c98675ed9023ed" exitCode=0 Oct 03 09:02:53 crc kubenswrapper[4810]: I1003 09:02:53.524997 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-k6kp8" event={"ID":"39603956-d605-4c0c-a342-a18f22e572bd","Type":"ContainerDied","Data":"ff15e78219b535ad72c83f852eb4e194818599bd9524f809a3c98675ed9023ed"} Oct 03 09:02:53 crc kubenswrapper[4810]: I1003 09:02:53.627115 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:53 crc kubenswrapper[4810]: I1003 09:02:53.627162 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.104:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.868530 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.892750 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtgj\" (UniqueName: \"kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj\") pod \"39603956-d605-4c0c-a342-a18f22e572bd\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.892980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts\") pod \"39603956-d605-4c0c-a342-a18f22e572bd\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.893038 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle\") pod \"39603956-d605-4c0c-a342-a18f22e572bd\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.893091 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data\") pod \"39603956-d605-4c0c-a342-a18f22e572bd\" (UID: \"39603956-d605-4c0c-a342-a18f22e572bd\") " Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.898033 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts" (OuterVolumeSpecName: "scripts") pod "39603956-d605-4c0c-a342-a18f22e572bd" (UID: "39603956-d605-4c0c-a342-a18f22e572bd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.908309 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj" (OuterVolumeSpecName: "kube-api-access-rbtgj") pod "39603956-d605-4c0c-a342-a18f22e572bd" (UID: "39603956-d605-4c0c-a342-a18f22e572bd"). InnerVolumeSpecName "kube-api-access-rbtgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.923826 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data" (OuterVolumeSpecName: "config-data") pod "39603956-d605-4c0c-a342-a18f22e572bd" (UID: "39603956-d605-4c0c-a342-a18f22e572bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.926551 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39603956-d605-4c0c-a342-a18f22e572bd" (UID: "39603956-d605-4c0c-a342-a18f22e572bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.995744 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.995781 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.995791 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39603956-d605-4c0c-a342-a18f22e572bd-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:54 crc kubenswrapper[4810]: I1003 09:02:54.995800 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbtgj\" (UniqueName: \"kubernetes.io/projected/39603956-d605-4c0c-a342-a18f22e572bd-kube-api-access-rbtgj\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.546026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-k6kp8" event={"ID":"39603956-d605-4c0c-a342-a18f22e572bd","Type":"ContainerDied","Data":"15571a385650600c69bb054af11b7dd90c7b9a65a7f7e796253c742eaf4fa9b9"} Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.546070 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15571a385650600c69bb054af11b7dd90c7b9a65a7f7e796253c742eaf4fa9b9" Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.546101 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-k6kp8" Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.763314 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.763636 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" containerID="cri-o://b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b" gracePeriod=30 Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.763673 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" containerID="cri-o://1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b" gracePeriod=30 Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.777051 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.777391 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" containerID="cri-o://91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" gracePeriod=30 Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.854691 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.854993 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-log" containerID="cri-o://a9076a758ff7aef766a584d973bb5be65b9283fd5090eea5f462ca0f42d6083f" gracePeriod=30 Oct 03 09:02:55 crc kubenswrapper[4810]: I1003 09:02:55.855124 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-metadata" containerID="cri-o://ec87bc4bd641aeabbf619cd1b5bca548318cb67371930db7ae78db54cb17b90b" gracePeriod=30 Oct 03 09:02:56 crc kubenswrapper[4810]: I1003 09:02:56.558330 4810 generic.go:334] "Generic (PLEG): container finished" podID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerID="a9076a758ff7aef766a584d973bb5be65b9283fd5090eea5f462ca0f42d6083f" exitCode=143 Oct 03 09:02:56 crc kubenswrapper[4810]: I1003 09:02:56.558557 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerDied","Data":"a9076a758ff7aef766a584d973bb5be65b9283fd5090eea5f462ca0f42d6083f"} Oct 03 09:02:56 crc kubenswrapper[4810]: I1003 09:02:56.560985 4810 generic.go:334] "Generic (PLEG): container finished" podID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerID="b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b" exitCode=143 Oct 03 09:02:56 crc kubenswrapper[4810]: I1003 09:02:56.561018 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerDied","Data":"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b"} Oct 03 09:02:58 crc kubenswrapper[4810]: E1003 09:02:58.000792 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ba9c65_630f_4a79_9233_ad8cecbcbf40.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:02:58 crc kubenswrapper[4810]: E1003 09:02:58.828129 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:02:58 crc kubenswrapper[4810]: E1003 09:02:58.829568 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:02:58 crc kubenswrapper[4810]: E1003 09:02:58.831401 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:02:58 crc kubenswrapper[4810]: E1003 09:02:58.831461 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:03 crc kubenswrapper[4810]: E1003 09:03:03.829517 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:03 crc kubenswrapper[4810]: E1003 09:03:03.833836 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:03 crc kubenswrapper[4810]: E1003 09:03:03.835968 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:03 crc kubenswrapper[4810]: E1003 09:03:03.836036 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:08 crc kubenswrapper[4810]: E1003 09:03:08.829559 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:08 crc kubenswrapper[4810]: E1003 09:03:08.832414 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:08 crc kubenswrapper[4810]: E1003 09:03:08.834876 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:08 crc kubenswrapper[4810]: E1003 09:03:08.835080 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.408168 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod19cb56ec-a904-4d5c-8489-2396eadf70b2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod19cb56ec-a904-4d5c-8489-2396eadf70b2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod19cb56ec_a904_4d5c_8489_2396eadf70b2.slice" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.691511 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.697550 4810 generic.go:334] "Generic (PLEG): container finished" podID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerID="ec87bc4bd641aeabbf619cd1b5bca548318cb67371930db7ae78db54cb17b90b" exitCode=0 Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.697640 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerDied","Data":"ec87bc4bd641aeabbf619cd1b5bca548318cb67371930db7ae78db54cb17b90b"} Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.697674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6370a6f8-04df-4a38-8d70-224197ca97ae","Type":"ContainerDied","Data":"308d0c9c04dc3e6a409bda225f994c389f0ff7c8d418f007dbe35f533fcf3ec8"} Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.697688 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308d0c9c04dc3e6a409bda225f994c389f0ff7c8d418f007dbe35f533fcf3ec8" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.698544 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.699228 4810 generic.go:334] "Generic (PLEG): container finished" podID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerID="1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b" exitCode=0 Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.699260 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerDied","Data":"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b"} Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.699293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f95d95a0-9815-4b27-8274-1d9828c2bb05","Type":"ContainerDied","Data":"5af67ea32b4ed280658f0c34799be88c1c4eaf7241cf80c847ac985bd3cf1b52"} Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.699330 4810 scope.go:117] "RemoveContainer" containerID="1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.699538 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.730094 4810 scope.go:117] "RemoveContainer" containerID="b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.758453 4810 scope.go:117] "RemoveContainer" containerID="1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b" Oct 03 09:03:09 crc kubenswrapper[4810]: E1003 09:03:09.759046 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b\": container with ID starting with 1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b not found: ID does not exist" containerID="1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.759094 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b"} err="failed to get container status \"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b\": rpc error: code = NotFound desc = could not find container \"1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b\": container with ID starting with 1bb3384b2a9b92ee8883a6ef1b53647ce1a7c7c250bc0b92a83dae085c7bcf7b not found: ID does not exist" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.759123 4810 scope.go:117] "RemoveContainer" containerID="b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b" Oct 03 09:03:09 crc kubenswrapper[4810]: E1003 09:03:09.759480 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b\": container with ID starting with b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b not found: ID does not exist" containerID="b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.759518 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b"} err="failed 
to get container status \"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b\": rpc error: code = NotFound desc = could not find container \"b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b\": container with ID starting with b89b52b128972c9a7094560c800b3b4ccad5bd35b418dbc9277eab7098ccd01b not found: ID does not exist" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791403 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle\") pod \"6370a6f8-04df-4a38-8d70-224197ca97ae\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791478 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbndn\" (UniqueName: \"kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn\") pod \"6370a6f8-04df-4a38-8d70-224197ca97ae\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791510 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs\") pod \"f95d95a0-9815-4b27-8274-1d9828c2bb05\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791568 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data\") pod \"6370a6f8-04df-4a38-8d70-224197ca97ae\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791669 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs\") pod \"6370a6f8-04df-4a38-8d70-224197ca97ae\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791710 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs\") pod \"6370a6f8-04df-4a38-8d70-224197ca97ae\" (UID: \"6370a6f8-04df-4a38-8d70-224197ca97ae\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791802 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data\") pod \"f95d95a0-9815-4b27-8274-1d9828c2bb05\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791823 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle\") pod \"f95d95a0-9815-4b27-8274-1d9828c2bb05\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.791861 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6vgc\" (UniqueName: \"kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc\") pod \"f95d95a0-9815-4b27-8274-1d9828c2bb05\" (UID: \"f95d95a0-9815-4b27-8274-1d9828c2bb05\") " Oct 03 09:03:09 crc 
kubenswrapper[4810]: I1003 09:03:09.792433 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs" (OuterVolumeSpecName: "logs") pod "6370a6f8-04df-4a38-8d70-224197ca97ae" (UID: "6370a6f8-04df-4a38-8d70-224197ca97ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.792550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs" (OuterVolumeSpecName: "logs") pod "f95d95a0-9815-4b27-8274-1d9828c2bb05" (UID: "f95d95a0-9815-4b27-8274-1d9828c2bb05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.801977 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn" (OuterVolumeSpecName: "kube-api-access-mbndn") pod "6370a6f8-04df-4a38-8d70-224197ca97ae" (UID: "6370a6f8-04df-4a38-8d70-224197ca97ae"). InnerVolumeSpecName "kube-api-access-mbndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.802079 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc" (OuterVolumeSpecName: "kube-api-access-x6vgc") pod "f95d95a0-9815-4b27-8274-1d9828c2bb05" (UID: "f95d95a0-9815-4b27-8274-1d9828c2bb05"). InnerVolumeSpecName "kube-api-access-x6vgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.820331 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data" (OuterVolumeSpecName: "config-data") pod "f95d95a0-9815-4b27-8274-1d9828c2bb05" (UID: "f95d95a0-9815-4b27-8274-1d9828c2bb05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.821278 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data" (OuterVolumeSpecName: "config-data") pod "6370a6f8-04df-4a38-8d70-224197ca97ae" (UID: "6370a6f8-04df-4a38-8d70-224197ca97ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.826696 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6370a6f8-04df-4a38-8d70-224197ca97ae" (UID: "6370a6f8-04df-4a38-8d70-224197ca97ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.830405 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f95d95a0-9815-4b27-8274-1d9828c2bb05" (UID: "f95d95a0-9815-4b27-8274-1d9828c2bb05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.843546 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6370a6f8-04df-4a38-8d70-224197ca97ae" (UID: "6370a6f8-04df-4a38-8d70-224197ca97ae"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894054 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6370a6f8-04df-4a38-8d70-224197ca97ae-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894091 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894105 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894127 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f95d95a0-9815-4b27-8274-1d9828c2bb05-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894141 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6vgc\" (UniqueName: \"kubernetes.io/projected/f95d95a0-9815-4b27-8274-1d9828c2bb05-kube-api-access-x6vgc\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894152 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894165 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbndn\" (UniqueName: \"kubernetes.io/projected/6370a6f8-04df-4a38-8d70-224197ca97ae-kube-api-access-mbndn\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894174 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f95d95a0-9815-4b27-8274-1d9828c2bb05-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:09 crc kubenswrapper[4810]: I1003 09:03:09.894181 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6370a6f8-04df-4a38-8d70-224197ca97ae-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.033268 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.042401 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065109 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: E1003 09:03:10.065579 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" 
containerName="nova-metadata-metadata" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065598 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-metadata" Oct 03 09:03:10 crc kubenswrapper[4810]: E1003 09:03:10.065608 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065614 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" Oct 03 09:03:10 crc kubenswrapper[4810]: E1003 09:03:10.065630 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-log" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065637 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-log" Oct 03 09:03:10 crc kubenswrapper[4810]: E1003 09:03:10.065649 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065655 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" Oct 03 09:03:10 crc kubenswrapper[4810]: E1003 09:03:10.065674 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39603956-d605-4c0c-a342-a18f22e572bd" containerName="nova-manage" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065680 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="39603956-d605-4c0c-a342-a18f22e572bd" containerName="nova-manage" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065857 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="39603956-d605-4c0c-a342-a18f22e572bd" containerName="nova-manage" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065871 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-metadata" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065886 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-log" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065928 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" containerName="nova-metadata-log" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.065949 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" containerName="nova-api-api" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.067053 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.069308 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.075671 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.199203 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.199274 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.199345 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.199535 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wz6\" (UniqueName: \"kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.302380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.302674 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.302812 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.302926 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wz6\" (UniqueName: \"kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.303720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " 
pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.310650 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.311597 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.322194 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wz6\" (UniqueName: \"kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6\") pod \"nova-api-0\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.396526 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.713744 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.766322 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.781070 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.817039 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.828401 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.833775 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.834348 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.834992 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.857644 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.919229 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrsvz\" (UniqueName: \"kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.919310 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.919396 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.919444 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:10 crc kubenswrapper[4810]: I1003 09:03:10.919482 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023112 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrsvz\" (UniqueName: \"kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023192 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023249 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023286 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023323 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.023867 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.026863 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.027231 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.030582 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.040329 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrsvz\" (UniqueName: \"kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz\") pod \"nova-metadata-0\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.201902 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.320446 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6370a6f8-04df-4a38-8d70-224197ca97ae" path="/var/lib/kubelet/pods/6370a6f8-04df-4a38-8d70-224197ca97ae/volumes" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.321965 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95d95a0-9815-4b27-8274-1d9828c2bb05" path="/var/lib/kubelet/pods/f95d95a0-9815-4b27-8274-1d9828c2bb05/volumes" Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.653953 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:03:11 crc kubenswrapper[4810]: W1003 09:03:11.659112 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod463a9578_5093_4812_a27c_c4d849a5ec67.slice/crio-4214e7f40b81374bd21871ca80061b3198f6b89f231c5f4fdfcc4d7db8cfc465 WatchSource:0}: Error finding container 4214e7f40b81374bd21871ca80061b3198f6b89f231c5f4fdfcc4d7db8cfc465: Status 404 returned error can't find the container with id 4214e7f40b81374bd21871ca80061b3198f6b89f231c5f4fdfcc4d7db8cfc465 Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.725943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerStarted","Data":"1dff7c6ce72064c60c1165570549842d36fbdeb864c3a00efb03073470742ee7"} Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.726004 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerStarted","Data":"dc1c8601928823d7720e50e18c1a8d3c330f97b86b9831f0d95bd971e121dc96"} Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.726018 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerStarted","Data":"f3905aacda123a473347a56ddd4c566abc1d753983c39ae25d7223b4f0cdee86"} Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.727252 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerStarted","Data":"4214e7f40b81374bd21871ca80061b3198f6b89f231c5f4fdfcc4d7db8cfc465"} Oct 03 09:03:11 crc kubenswrapper[4810]: I1003 09:03:11.751370 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.751348876 podStartE2EDuration="1.751348876s" podCreationTimestamp="2025-10-03 09:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:11.744796211 +0000 UTC m=+7625.172046946" watchObservedRunningTime="2025-10-03 09:03:11.751348876 +0000 UTC m=+7625.178599611" Oct 03 09:03:12 crc kubenswrapper[4810]: I1003 09:03:12.739980 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerStarted","Data":"180bad1c50d897c2a6c43df0a1afbfa7d6c5a60011b7aa487c86d10db9b1980d"} Oct 03 09:03:12 crc kubenswrapper[4810]: I1003 09:03:12.741261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerStarted","Data":"c0451198ffb49cf3efcc829fbf4f4b7d23054fd0f095c80c41379ab2c12fc9df"} Oct 03 09:03:12 crc kubenswrapper[4810]: I1003 09:03:12.760303 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.7602727700000003 podStartE2EDuration="2.76027277s" podCreationTimestamp="2025-10-03 09:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:12.757270489 +0000 UTC m=+7626.184521224" watchObservedRunningTime="2025-10-03 09:03:12.76027277 +0000 UTC m=+7626.187523545" Oct 03 09:03:13 crc kubenswrapper[4810]: E1003 09:03:13.828826 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:13 crc kubenswrapper[4810]: E1003 09:03:13.831170 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:13 crc kubenswrapper[4810]: E1003 09:03:13.833278 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:13 crc kubenswrapper[4810]: E1003 09:03:13.833332 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:16 crc kubenswrapper[4810]: I1003 09:03:16.202869 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:03:16 crc kubenswrapper[4810]: I1003 09:03:16.202974 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:03:18 crc kubenswrapper[4810]: E1003 09:03:18.830119 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:18 crc kubenswrapper[4810]: E1003 09:03:18.832654 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:18 crc kubenswrapper[4810]: E1003 09:03:18.833972 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:18 crc kubenswrapper[4810]: E1003 09:03:18.834059 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:20 crc kubenswrapper[4810]: I1003 09:03:20.397137 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:03:20 crc kubenswrapper[4810]: I1003 09:03:20.397198 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:03:21 crc kubenswrapper[4810]: I1003 09:03:21.203494 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:03:21 crc kubenswrapper[4810]: I1003 09:03:21.204132 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:03:21 crc kubenswrapper[4810]: I1003 09:03:21.479179 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.109:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:21 crc kubenswrapper[4810]: I1003 09:03:21.479186 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.109:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:22 crc kubenswrapper[4810]: I1003 09:03:22.218255 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:22 crc kubenswrapper[4810]: I1003 09:03:22.218328 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:23 crc kubenswrapper[4810]: E1003 09:03:23.830857 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:23 crc kubenswrapper[4810]: E1003 09:03:23.833648 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:23 crc 
kubenswrapper[4810]: E1003 09:03:23.835634 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:03:23 crc kubenswrapper[4810]: E1003 09:03:23.835685 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:25 crc kubenswrapper[4810]: I1003 09:03:25.882588 4810 generic.go:334] "Generic (PLEG): container finished" podID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" exitCode=137 Oct 03 09:03:25 crc kubenswrapper[4810]: I1003 09:03:25.882660 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bafbe49d-ab92-44bf-aba9-d72aab06e528","Type":"ContainerDied","Data":"91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09"} Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.220343 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.377914 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle\") pod \"bafbe49d-ab92-44bf-aba9-d72aab06e528\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.378319 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data\") pod \"bafbe49d-ab92-44bf-aba9-d72aab06e528\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.378467 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmjxm\" (UniqueName: \"kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm\") pod \"bafbe49d-ab92-44bf-aba9-d72aab06e528\" (UID: \"bafbe49d-ab92-44bf-aba9-d72aab06e528\") " Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.386584 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm" (OuterVolumeSpecName: "kube-api-access-tmjxm") pod "bafbe49d-ab92-44bf-aba9-d72aab06e528" (UID: "bafbe49d-ab92-44bf-aba9-d72aab06e528"). InnerVolumeSpecName "kube-api-access-tmjxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.413531 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data" (OuterVolumeSpecName: "config-data") pod "bafbe49d-ab92-44bf-aba9-d72aab06e528" (UID: "bafbe49d-ab92-44bf-aba9-d72aab06e528"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.416864 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bafbe49d-ab92-44bf-aba9-d72aab06e528" (UID: "bafbe49d-ab92-44bf-aba9-d72aab06e528"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.481342 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.481389 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bafbe49d-ab92-44bf-aba9-d72aab06e528-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.481402 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmjxm\" (UniqueName: \"kubernetes.io/projected/bafbe49d-ab92-44bf-aba9-d72aab06e528-kube-api-access-tmjxm\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.895576 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bafbe49d-ab92-44bf-aba9-d72aab06e528","Type":"ContainerDied","Data":"a491acd82300a62faaf90e8bd3e138e6afed0e9014e44ca0b47a5804b99f6260"} Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.895644 4810 scope.go:117] "RemoveContainer" containerID="91d88e761fe7d26b16241edf5a6ed5485279c7f7eaa4dbd360b5f84ce9846d09" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.895642 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.943748 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.955024 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.968306 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:03:26 crc kubenswrapper[4810]: E1003 09:03:26.969330 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.969477 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.969965 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" containerName="nova-scheduler-scheduler" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.971115 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.973519 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 09:03:26 crc kubenswrapper[4810]: I1003 09:03:26.975646 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.093692 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.093776 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22tll\" (UniqueName: \"kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.093915 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.195653 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.195738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22tll\" (UniqueName: \"kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.195817 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.199459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.199485 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.221368 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22tll\" (UniqueName: 
\"kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll\") pod \"nova-scheduler-0\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.290558 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.314933 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafbe49d-ab92-44bf-aba9-d72aab06e528" path="/var/lib/kubelet/pods/bafbe49d-ab92-44bf-aba9-d72aab06e528/volumes" Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.721416 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:03:27 crc kubenswrapper[4810]: W1003 09:03:27.726436 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ba8eff_5520_4703_911a_992a05fc8bdc.slice/crio-29e51125705b55d6c153d6179c026556a59748c93650281561316eba48e19e72 WatchSource:0}: Error finding container 29e51125705b55d6c153d6179c026556a59748c93650281561316eba48e19e72: Status 404 returned error can't find the container with id 29e51125705b55d6c153d6179c026556a59748c93650281561316eba48e19e72 Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.920544 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1ba8eff-5520-4703-911a-992a05fc8bdc","Type":"ContainerStarted","Data":"29e51125705b55d6c153d6179c026556a59748c93650281561316eba48e19e72"} Oct 03 09:03:27 crc kubenswrapper[4810]: I1003 09:03:27.953707 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9536837839999999 podStartE2EDuration="1.953683784s" podCreationTimestamp="2025-10-03 09:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:27.944288263 +0000 UTC m=+7641.371538998" watchObservedRunningTime="2025-10-03 09:03:27.953683784 +0000 UTC m=+7641.380934539" Oct 03 09:03:28 crc kubenswrapper[4810]: I1003 09:03:28.955844 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1ba8eff-5520-4703-911a-992a05fc8bdc","Type":"ContainerStarted","Data":"6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834"} Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.400826 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.401702 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.404235 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.404624 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.979119 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:03:30 crc kubenswrapper[4810]: I1003 09:03:30.984235 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.199554 4810 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.201154 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.213834 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.215054 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.225538 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.229756 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.297773 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.298422 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8knc\" (UniqueName: \"kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.298933 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.299011 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.299147 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.402505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.402760 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.402998 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.403376 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.403431 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8knc\" (UniqueName: \"kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.403513 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.404275 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.404759 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.404796 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.430295 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8knc\" (UniqueName: \"kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc\") pod \"dnsmasq-dns-844fb94945-h8dp6\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.523477 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.812949 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:03:31 crc kubenswrapper[4810]: I1003 09:03:31.992865 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" event={"ID":"2246ca28-9b3f-40d7-8d8c-4feb750d5d41","Type":"ContainerStarted","Data":"a2ed6591046b98e0fc64eb8d188ab68ab0e1381294b139a1117c13cdb1624c76"} Oct 03 09:03:32 crc kubenswrapper[4810]: I1003 09:03:32.000001 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:03:32 crc kubenswrapper[4810]: I1003 09:03:32.291309 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:03:33 crc kubenswrapper[4810]: I1003 09:03:33.003923 4810 generic.go:334] "Generic (PLEG): container finished" podID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerID="82c2b517a15956667e2ff7d96e00a8894f23fd7255a9bfd77aa7585c60994331" exitCode=0 Oct 03 09:03:33 crc kubenswrapper[4810]: I1003 09:03:33.004027 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" event={"ID":"2246ca28-9b3f-40d7-8d8c-4feb750d5d41","Type":"ContainerDied","Data":"82c2b517a15956667e2ff7d96e00a8894f23fd7255a9bfd77aa7585c60994331"} Oct 03 09:03:34 crc kubenswrapper[4810]: I1003 09:03:34.019012 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" event={"ID":"2246ca28-9b3f-40d7-8d8c-4feb750d5d41","Type":"ContainerStarted","Data":"b15c68365da2964e4b50c0011f035005f70603978d31e1076761b396d98183c1"} Oct 03 09:03:34 crc kubenswrapper[4810]: I1003 09:03:34.047889 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" podStartSLOduration=3.047868365 podStartE2EDuration="3.047868365s" podCreationTimestamp="2025-10-03 09:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:34.04656186 +0000 UTC m=+7647.473812595" watchObservedRunningTime="2025-10-03 09:03:34.047868365 +0000 UTC m=+7647.475119100" Oct 03 09:03:34 crc kubenswrapper[4810]: I1003 09:03:34.811844 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:34 crc kubenswrapper[4810]: I1003 09:03:34.812179 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-log" containerID="cri-o://dc1c8601928823d7720e50e18c1a8d3c330f97b86b9831f0d95bd971e121dc96" gracePeriod=30 Oct 03 09:03:34 crc kubenswrapper[4810]: I1003 09:03:34.812258 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-api" containerID="cri-o://1dff7c6ce72064c60c1165570549842d36fbdeb864c3a00efb03073470742ee7" gracePeriod=30 Oct 03 09:03:35 crc kubenswrapper[4810]: I1003 09:03:35.034687 4810 generic.go:334] "Generic (PLEG): container finished" podID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerID="dc1c8601928823d7720e50e18c1a8d3c330f97b86b9831f0d95bd971e121dc96" exitCode=143 Oct 03 09:03:35 crc kubenswrapper[4810]: I1003 09:03:35.034883 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerDied","Data":"dc1c8601928823d7720e50e18c1a8d3c330f97b86b9831f0d95bd971e121dc96"} Oct 03 09:03:35 crc kubenswrapper[4810]: I1003 09:03:35.036283 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:37 crc kubenswrapper[4810]: I1003 09:03:37.290823 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 09:03:37 crc kubenswrapper[4810]: I1003 09:03:37.319520 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.078816 4810 generic.go:334] "Generic (PLEG): container finished" podID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerID="1dff7c6ce72064c60c1165570549842d36fbdeb864c3a00efb03073470742ee7" exitCode=0 Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.078870 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerDied","Data":"1dff7c6ce72064c60c1165570549842d36fbdeb864c3a00efb03073470742ee7"} Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.108561 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.365623 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.542717 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wz6\" (UniqueName: \"kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6\") pod \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.542790 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs\") pod \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.542831 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data\") pod \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.542878 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle\") pod \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\" (UID: \"4cb253ae-4e56-42bf-ae37-0daf9b5af462\") " Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.543273 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs" (OuterVolumeSpecName: "logs") pod "4cb253ae-4e56-42bf-ae37-0daf9b5af462" (UID: "4cb253ae-4e56-42bf-ae37-0daf9b5af462"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.543632 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cb253ae-4e56-42bf-ae37-0daf9b5af462-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.557189 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6" (OuterVolumeSpecName: "kube-api-access-j4wz6") pod "4cb253ae-4e56-42bf-ae37-0daf9b5af462" (UID: "4cb253ae-4e56-42bf-ae37-0daf9b5af462"). InnerVolumeSpecName "kube-api-access-j4wz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.578701 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cb253ae-4e56-42bf-ae37-0daf9b5af462" (UID: "4cb253ae-4e56-42bf-ae37-0daf9b5af462"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.582763 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data" (OuterVolumeSpecName: "config-data") pod "4cb253ae-4e56-42bf-ae37-0daf9b5af462" (UID: "4cb253ae-4e56-42bf-ae37-0daf9b5af462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.645350 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wz6\" (UniqueName: \"kubernetes.io/projected/4cb253ae-4e56-42bf-ae37-0daf9b5af462-kube-api-access-j4wz6\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.645395 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:38 crc kubenswrapper[4810]: I1003 09:03:38.645410 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cb253ae-4e56-42bf-ae37-0daf9b5af462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.108005 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.108948 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4cb253ae-4e56-42bf-ae37-0daf9b5af462","Type":"ContainerDied","Data":"f3905aacda123a473347a56ddd4c566abc1d753983c39ae25d7223b4f0cdee86"} Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.108998 4810 scope.go:117] "RemoveContainer" containerID="1dff7c6ce72064c60c1165570549842d36fbdeb864c3a00efb03073470742ee7" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.141640 4810 scope.go:117] "RemoveContainer" containerID="dc1c8601928823d7720e50e18c1a8d3c330f97b86b9831f0d95bd971e121dc96" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.146768 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.155662 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.172253 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:39 crc kubenswrapper[4810]: E1003 09:03:39.173084 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-api" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.173103 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-api" Oct 03 09:03:39 crc kubenswrapper[4810]: E1003 09:03:39.173122 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-log" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.173128 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-log" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.173315 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-api" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.173332 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" containerName="nova-api-log" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.174501 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.179751 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.180147 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.181961 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.182373 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.257796 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.257869 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.257910 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.257969 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9gp\" (UniqueName: \"kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.258030 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.258070 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.315027 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cb253ae-4e56-42bf-ae37-0daf9b5af462" path="/var/lib/kubelet/pods/4cb253ae-4e56-42bf-ae37-0daf9b5af462/volumes" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.359372 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc 
kubenswrapper[4810]: I1003 09:03:39.359428 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.359495 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.359523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.359540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.359577 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9gp\" (UniqueName: \"kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.360605 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.364674 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.364990 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.367469 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.369258 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.377216 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cd9gp\" (UniqueName: \"kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp\") pod \"nova-api-0\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.493843 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:03:39 crc kubenswrapper[4810]: I1003 09:03:39.934171 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:03:40 crc kubenswrapper[4810]: I1003 09:03:40.124085 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerStarted","Data":"28e7e44846cdd8453c36b8be13314a2d7a56d1b1d0b054b8b1ad8927942162af"} Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.136067 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerStarted","Data":"a51f1a244e80ed64834a6d007023e154a4199448c67d8f2e29312bbb25ff2f8c"} Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.136430 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerStarted","Data":"4458f484a54244dae918accd79a7327d261f442db2d6d5731024ddb8dc509061"} Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.157812 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.157789812 podStartE2EDuration="2.157789812s" podCreationTimestamp="2025-10-03 09:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:41.157415203 +0000 UTC m=+7654.584665958" watchObservedRunningTime="2025-10-03 09:03:41.157789812 +0000 UTC m=+7654.585040547" Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.525104 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.588081 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:03:41 crc kubenswrapper[4810]: I1003 09:03:41.589008 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="dnsmasq-dns" containerID="cri-o://dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466" gracePeriod=10 Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.054872 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.115377 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config\") pod \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.115454 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb\") pod \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.115501 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vr7p\" (UniqueName: \"kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p\") pod \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.115542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc\") pod \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.115675 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb\") pod \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\" (UID: \"627c0a25-d0c9-4b90-beeb-de5398d1fb7c\") " Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.122265 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p" (OuterVolumeSpecName: "kube-api-access-7vr7p") pod "627c0a25-d0c9-4b90-beeb-de5398d1fb7c" (UID: "627c0a25-d0c9-4b90-beeb-de5398d1fb7c"). InnerVolumeSpecName "kube-api-access-7vr7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.159754 4810 generic.go:334] "Generic (PLEG): container finished" podID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerID="dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466" exitCode=0 Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.161741 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" event={"ID":"627c0a25-d0c9-4b90-beeb-de5398d1fb7c","Type":"ContainerDied","Data":"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466"} Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.162008 4810 scope.go:117] "RemoveContainer" containerID="dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.162630 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.161810 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67f978bc79-wq9xp" event={"ID":"627c0a25-d0c9-4b90-beeb-de5398d1fb7c","Type":"ContainerDied","Data":"ac2b13d2de655fad339b80849e016fdb119532f508221bbcb0054deeed91bdcd"} Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.171389 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "627c0a25-d0c9-4b90-beeb-de5398d1fb7c" (UID: "627c0a25-d0c9-4b90-beeb-de5398d1fb7c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.201410 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config" (OuterVolumeSpecName: "config") pod "627c0a25-d0c9-4b90-beeb-de5398d1fb7c" (UID: "627c0a25-d0c9-4b90-beeb-de5398d1fb7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.204813 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "627c0a25-d0c9-4b90-beeb-de5398d1fb7c" (UID: "627c0a25-d0c9-4b90-beeb-de5398d1fb7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.218298 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.218334 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.218345 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vr7p\" (UniqueName: \"kubernetes.io/projected/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-kube-api-access-7vr7p\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.218355 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.219968 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "627c0a25-d0c9-4b90-beeb-de5398d1fb7c" (UID: "627c0a25-d0c9-4b90-beeb-de5398d1fb7c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.237231 4810 scope.go:117] "RemoveContainer" containerID="e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.263441 4810 scope.go:117] "RemoveContainer" containerID="dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466" Oct 03 09:03:42 crc kubenswrapper[4810]: E1003 09:03:42.263880 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466\": container with ID starting with dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466 not found: ID does not exist" containerID="dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.263942 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466"} err="failed to get container status \"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466\": rpc error: code = NotFound desc = could not find container \"dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466\": container with ID starting with dae903aa76f9d32914e264a577cc6427cd4aa52d66a19868927312526a827466 not found: ID does not exist" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.263968 4810 scope.go:117] "RemoveContainer" containerID="e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3" Oct 03 09:03:42 crc kubenswrapper[4810]: E1003 09:03:42.264330 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3\": container with ID starting with e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3 not found: ID does not exist" containerID="e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.264368 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3"} err="failed to get container status \"e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3\": rpc error: code = NotFound desc = could not find container \"e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3\": container with ID starting with e6036ebe296121b3a5c4fc0f383a8d1c01a211c54260c67e8d0b7c1431252ba3 not found: ID does not exist" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.320209 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/627c0a25-d0c9-4b90-beeb-de5398d1fb7c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.494227 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:03:42 crc kubenswrapper[4810]: I1003 09:03:42.503212 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67f978bc79-wq9xp"] Oct 03 09:03:43 crc kubenswrapper[4810]: I1003 09:03:43.325140 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" path="/var/lib/kubelet/pods/627c0a25-d0c9-4b90-beeb-de5398d1fb7c/volumes" 
Oct 03 09:03:49 crc kubenswrapper[4810]: I1003 09:03:49.495068 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:03:49 crc kubenswrapper[4810]: I1003 09:03:49.495715 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:03:50 crc kubenswrapper[4810]: I1003 09:03:50.507102 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.113:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:50 crc kubenswrapper[4810]: I1003 09:03:50.507331 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.113:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:03:59 crc kubenswrapper[4810]: I1003 09:03:59.501707 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:03:59 crc kubenswrapper[4810]: I1003 09:03:59.502605 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:03:59 crc kubenswrapper[4810]: I1003 09:03:59.502701 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:03:59 crc kubenswrapper[4810]: I1003 09:03:59.508632 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:04:00 crc kubenswrapper[4810]: I1003 09:04:00.046158 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-b42pc"] Oct 03 09:04:00 crc kubenswrapper[4810]: I1003 09:04:00.055549 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-b42pc"] Oct 03 09:04:00 crc kubenswrapper[4810]: I1003 09:04:00.325504 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:04:00 crc kubenswrapper[4810]: I1003 09:04:00.333997 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:04:01 crc kubenswrapper[4810]: I1003 09:04:01.317251 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3af9106-156d-42a4-a177-4a104952b1f0" path="/var/lib/kubelet/pods/d3af9106-156d-42a4-a177-4a104952b1f0/volumes" Oct 03 09:04:10 crc kubenswrapper[4810]: I1003 09:04:10.034046 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-6011-account-create-6zn2n"] Oct 03 09:04:10 crc kubenswrapper[4810]: I1003 09:04:10.043850 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-6011-account-create-6zn2n"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.313753 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2896c9-43c9-439b-ac75-88a33a0928be" path="/var/lib/kubelet/pods/0c2896c9-43c9-439b-ac75-88a33a0928be/volumes" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.735022 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:11 crc kubenswrapper[4810]: E1003 09:04:11.736412 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="init" Oct 03 
09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.736442 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="init" Oct 03 09:04:11 crc kubenswrapper[4810]: E1003 09:04:11.736537 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="dnsmasq-dns" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.736552 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="dnsmasq-dns" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.736885 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="627c0a25-d0c9-4b90-beeb-de5398d1fb7c" containerName="dnsmasq-dns" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.738439 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.742682 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.742915 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.744405 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.745581 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mmw9j" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.750087 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.796258 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.796579 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-log" containerID="cri-o://503d86e98c9406e6b5448ab44f0b4b03255e58a9694d523acb5adcf474471755" gracePeriod=30 Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.797216 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-httpd" containerID="cri-o://627170b69762133e49cc01416448d4326d9f790d64b22049e882b367a2baf91c" gracePeriod=30 Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.832131 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.832204 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.832306 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.832415 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jvhk\" (UniqueName: \"kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.832506 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.863005 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.864784 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.879104 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.879420 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-log" containerID="cri-o://273e1df6ad2e134547d4a271c43908bec4a5fac4bedce1850f5c497cd1349951" gracePeriod=30 Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.879511 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-httpd" containerID="cri-o://04289b71777a3d321c84f88730195ef7e5009fbba18c93816fd6e2925f8793df" gracePeriod=30 Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.897087 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.933748 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.933837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.933873 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs\") pod \"horizon-76bdc5d745-bxfkf\" (UID: 
\"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.933941 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.934032 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jvhk\" (UniqueName: \"kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.934779 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.935411 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.935992 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.949124 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:11 crc kubenswrapper[4810]: I1003 09:04:11.956586 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jvhk\" (UniqueName: \"kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk\") pod \"horizon-76bdc5d745-bxfkf\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.037722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhtlr\" (UniqueName: \"kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.037830 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: 
I1003 09:04:12.038767 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.038852 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.038887 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.072036 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.140986 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhtlr\" (UniqueName: \"kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.141347 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.141438 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.141484 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.141509 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.142179 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 
crc kubenswrapper[4810]: I1003 09:04:12.143914 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.143921 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.153738 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.162509 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhtlr\" (UniqueName: \"kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr\") pod \"horizon-6687d64989-wxhgg\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.193619 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.476419 4810 generic.go:334] "Generic (PLEG): container finished" podID="a116c862-90e2-44c5-834f-e273910f1297" containerID="503d86e98c9406e6b5448ab44f0b4b03255e58a9694d523acb5adcf474471755" exitCode=143 Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.476532 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerDied","Data":"503d86e98c9406e6b5448ab44f0b4b03255e58a9694d523acb5adcf474471755"} Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.484167 4810 generic.go:334] "Generic (PLEG): container finished" podID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerID="273e1df6ad2e134547d4a271c43908bec4a5fac4bedce1850f5c497cd1349951" exitCode=143 Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.484219 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerDied","Data":"273e1df6ad2e134547d4a271c43908bec4a5fac4bedce1850f5c497cd1349951"} Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.542068 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:12 crc kubenswrapper[4810]: I1003 09:04:12.695772 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:12 crc kubenswrapper[4810]: W1003 09:04:12.699823 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod394bf15b_cd2f_4486_b716_bd07838fbd94.slice/crio-8359bafd7d36f65f1b80b2570e99a9df73c465500cc1e46cbf47225741900e10 WatchSource:0}: Error finding container 8359bafd7d36f65f1b80b2570e99a9df73c465500cc1e46cbf47225741900e10: Status 404 returned 
error can't find the container with id 8359bafd7d36f65f1b80b2570e99a9df73c465500cc1e46cbf47225741900e10 Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.389321 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.432778 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.434692 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.437215 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.458550 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.482608 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6f9z\" (UniqueName: \"kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.483959 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.484080 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.484158 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.484254 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.484338 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.484480 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.501083 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerStarted","Data":"d751683e370625d3f5d5689997632817bb3d46399fc4e91e473172a4f5c392ea"} Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.502295 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerStarted","Data":"8359bafd7d36f65f1b80b2570e99a9df73c465500cc1e46cbf47225741900e10"} Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.529510 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.567396 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.569678 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.586921 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588263 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6f9z\" (UniqueName: \"kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588322 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588343 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588368 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588401 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.588422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.589061 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.589741 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.590622 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.592111 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.597215 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.621009 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.621601 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.634802 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6f9z\" (UniqueName: \"kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z\") pod \"horizon-6c76f5b976-p66q9\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.690763 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " 
pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.690862 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.690930 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfnhn\" (UniqueName: \"kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.691001 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.691069 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.691104 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.691127 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.775816 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.793415 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.793809 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfnhn\" (UniqueName: \"kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.793871 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.794129 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.794170 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.794193 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.794260 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.794579 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.795602 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.796356 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.810013 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.811390 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfnhn\" (UniqueName: \"kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.811837 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.815362 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs\") pod \"horizon-95fbdc968-9z87d\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:13 crc kubenswrapper[4810]: I1003 09:04:13.894493 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:14 crc kubenswrapper[4810]: I1003 09:04:14.292578 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:04:14 crc kubenswrapper[4810]: W1003 09:04:14.306289 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f271d78_0084_43b6_9e2b_f8f19b515f70.slice/crio-0e9351d15111606ff8d6a8c9ea0944363add4ccc353058884b403a7f8badf32d WatchSource:0}: Error finding container 0e9351d15111606ff8d6a8c9ea0944363add4ccc353058884b403a7f8badf32d: Status 404 returned error can't find the container with id 0e9351d15111606ff8d6a8c9ea0944363add4ccc353058884b403a7f8badf32d Oct 03 09:04:14 crc kubenswrapper[4810]: I1003 09:04:14.384156 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:04:14 crc kubenswrapper[4810]: W1003 09:04:14.389839 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ac00941_8b00_47af_9183_b7578731f8eb.slice/crio-aab253c2eedc274ca536239b0930ccdf45476f0d1aa0ea8cc319156fe04e8d8b WatchSource:0}: Error finding container aab253c2eedc274ca536239b0930ccdf45476f0d1aa0ea8cc319156fe04e8d8b: Status 404 returned error can't find the container with id aab253c2eedc274ca536239b0930ccdf45476f0d1aa0ea8cc319156fe04e8d8b Oct 03 09:04:14 crc kubenswrapper[4810]: I1003 09:04:14.515811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerStarted","Data":"aab253c2eedc274ca536239b0930ccdf45476f0d1aa0ea8cc319156fe04e8d8b"} Oct 03 09:04:14 crc kubenswrapper[4810]: I1003 09:04:14.517273 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerStarted","Data":"0e9351d15111606ff8d6a8c9ea0944363add4ccc353058884b403a7f8badf32d"} Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.037567 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.1.78:9292/healthcheck\": read tcp 10.217.0.2:55476->10.217.1.78:9292: read: connection reset by peer" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.039794 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.1.78:9292/healthcheck\": read tcp 10.217.0.2:55464->10.217.1.78:9292: read: connection reset by peer" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.545112 4810 generic.go:334] "Generic (PLEG): container finished" podID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerID="04289b71777a3d321c84f88730195ef7e5009fbba18c93816fd6e2925f8793df" exitCode=0 Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.545221 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerDied","Data":"04289b71777a3d321c84f88730195ef7e5009fbba18c93816fd6e2925f8793df"} Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.549847 4810 generic.go:334] "Generic (PLEG): container 
finished" podID="a116c862-90e2-44c5-834f-e273910f1297" containerID="627170b69762133e49cc01416448d4326d9f790d64b22049e882b367a2baf91c" exitCode=0 Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.549884 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerDied","Data":"627170b69762133e49cc01416448d4326d9f790d64b22049e882b367a2baf91c"} Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.633257 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.707025 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.744889 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n8cp\" (UniqueName: \"kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745087 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745120 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745154 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745198 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745232 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.745309 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle\") pod \"a116c862-90e2-44c5-834f-e273910f1297\" (UID: \"a116c862-90e2-44c5-834f-e273910f1297\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.749690 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.749758 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs" (OuterVolumeSpecName: "logs") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.754284 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp" (OuterVolumeSpecName: "kube-api-access-6n8cp") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "kube-api-access-6n8cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.762739 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts" (OuterVolumeSpecName: "scripts") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.798310 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.838146 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data" (OuterVolumeSpecName: "config-data") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.839882 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a116c862-90e2-44c5-834f-e273910f1297" (UID: "a116c862-90e2-44c5-834f-e273910f1297"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.847811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.847980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.848035 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.848081 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj27x\" (UniqueName: \"kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.848124 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.848164 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.848219 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data\") pod \"6b45c597-49f0-41b4-9b88-250779a49f4e\" (UID: \"6b45c597-49f0-41b4-9b88-250779a49f4e\") " Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.849414 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.849719 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs" (OuterVolumeSpecName: "logs") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852239 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n8cp\" (UniqueName: \"kubernetes.io/projected/a116c862-90e2-44c5-834f-e273910f1297-kube-api-access-6n8cp\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852273 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852289 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852300 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a116c862-90e2-44c5-834f-e273910f1297-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852311 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852321 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.852330 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a116c862-90e2-44c5-834f-e273910f1297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.853297 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x" (OuterVolumeSpecName: "kube-api-access-mj27x") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "kube-api-access-mj27x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.858453 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts" (OuterVolumeSpecName: "scripts") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.877146 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.908805 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data" (OuterVolumeSpecName: "config-data") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.917307 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b45c597-49f0-41b4-9b88-250779a49f4e" (UID: "6b45c597-49f0-41b4-9b88-250779a49f4e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954347 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954398 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj27x\" (UniqueName: \"kubernetes.io/projected/6b45c597-49f0-41b4-9b88-250779a49f4e-kube-api-access-mj27x\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954413 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954425 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954438 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954450 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b45c597-49f0-41b4-9b88-250779a49f4e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:15 crc kubenswrapper[4810]: I1003 09:04:15.954465 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b45c597-49f0-41b4-9b88-250779a49f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.574707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a116c862-90e2-44c5-834f-e273910f1297","Type":"ContainerDied","Data":"bf2216e0769a6770980ba2269e212db32a76d17921283f2e2428ff394316c1bb"} Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.575140 4810 scope.go:117] "RemoveContainer" containerID="627170b69762133e49cc01416448d4326d9f790d64b22049e882b367a2baf91c" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.574736 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.583830 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b45c597-49f0-41b4-9b88-250779a49f4e","Type":"ContainerDied","Data":"da6d36cfa7b7b637f4e73ae57423fd756614536020cb3a1b0901b43424f13fad"} Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.583989 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.636035 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.663476 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.676269 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.691298 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.705722 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: E1003 09:04:16.706242 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-log" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706268 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-log" Oct 03 09:04:16 crc kubenswrapper[4810]: E1003 09:04:16.706301 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706313 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: E1003 09:04:16.706342 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706353 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: E1003 09:04:16.706368 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-log" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706376 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-log" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706610 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-log" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706642 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706659 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a116c862-90e2-44c5-834f-e273910f1297" containerName="glance-log" Oct 
03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.706680 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" containerName="glance-httpd" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.708201 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.710114 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.710356 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.710867 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.711083 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4b2nh" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.721646 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.735633 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.737349 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.745364 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.745450 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.745523 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896205 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896293 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896364 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896454 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896491 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896520 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2rh\" (UniqueName: \"kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896622 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-846jc\" (UniqueName: \"kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896669 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896722 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896760 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896796 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896868 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:16 crc kubenswrapper[4810]: I1003 09:04:16.896931 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.002288 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.002550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.002692 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.002748 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.002920 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.003693 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.003920 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.003978 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.004413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.004830 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.004995 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005033 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005083 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005137 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2rh\" (UniqueName: \"kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005181 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-846jc\" (UniqueName: \"kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005241 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.005836 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.006350 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.007767 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.007861 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.009102 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.018821 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.019249 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.019473 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.020539 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.020951 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc 
kubenswrapper[4810]: I1003 09:04:17.024981 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-846jc\" (UniqueName: \"kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc\") pod \"glance-default-internal-api-0\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.026299 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2rh\" (UniqueName: \"kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh\") pod \"glance-default-external-api-0\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.031021 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.070965 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.252790 4810 scope.go:117] "RemoveContainer" containerID="4482b834e96ae5d1998f4490aac21c902880d5d6231a7c4e7cd7109e90397dd2" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.322983 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b45c597-49f0-41b4-9b88-250779a49f4e" path="/var/lib/kubelet/pods/6b45c597-49f0-41b4-9b88-250779a49f4e/volumes" Oct 03 09:04:17 crc kubenswrapper[4810]: I1003 09:04:17.324061 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a116c862-90e2-44c5-834f-e273910f1297" path="/var/lib/kubelet/pods/a116c862-90e2-44c5-834f-e273910f1297/volumes" Oct 03 09:04:21 crc kubenswrapper[4810]: I1003 09:04:21.035275 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-96fxf"] Oct 03 09:04:21 crc kubenswrapper[4810]: I1003 09:04:21.049237 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-96fxf"] Oct 03 09:04:21 crc kubenswrapper[4810]: I1003 09:04:21.317555 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a4700d-8b62-43c3-9dee-6971f3c59399" path="/var/lib/kubelet/pods/24a4700d-8b62-43c3-9dee-6971f3c59399/volumes" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.361649 4810 scope.go:117] "RemoveContainer" containerID="503d86e98c9406e6b5448ab44f0b4b03255e58a9694d523acb5adcf474471755" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.387809 4810 scope.go:117] "RemoveContainer" containerID="b52673607d02a31be3ec875e91f22569829daf5bed9fb51893d9141467beef9b" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.431083 4810 scope.go:117] "RemoveContainer" containerID="04289b71777a3d321c84f88730195ef7e5009fbba18c93816fd6e2925f8793df" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.449604 4810 scope.go:117] "RemoveContainer" containerID="ae6412e881c90e54d9b1b29cff15387b52b4b7f40e6109b57f47fef1ee661338" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.600288 4810 scope.go:117] "RemoveContainer" containerID="273e1df6ad2e134547d4a271c43908bec4a5fac4bedce1850f5c497cd1349951" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.640970 4810 scope.go:117] "RemoveContainer" containerID="5155a216c214c324bde39971c5a142849a558759dc4cd64c80fe40ac6bc1f59c" Oct 03 09:04:22 crc kubenswrapper[4810]: I1003 09:04:22.705043 4810 scope.go:117] 
"RemoveContainer" containerID="ffc55cc6f13bda534a3bddbd2bbd6967e5152638c775c3a6f3127783f9fc9c4a" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.072032 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:04:23 crc kubenswrapper[4810]: W1003 09:04:23.076211 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1d96d1_04f9_4dd6_aabc_5a55b291d47b.slice/crio-65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3 WatchSource:0}: Error finding container 65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3: Status 404 returned error can't find the container with id 65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.154341 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:04:23 crc kubenswrapper[4810]: W1003 09:04:23.165785 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad2fe8b2_c0a4_4839_bd1c_18e665a881cf.slice/crio-0f714d9ba4d18c319d6c7a89a36f7f6a124b1a686c26f0111a661be7af6a8c27 WatchSource:0}: Error finding container 0f714d9ba4d18c319d6c7a89a36f7f6a124b1a686c26f0111a661be7af6a8c27: Status 404 returned error can't find the container with id 0f714d9ba4d18c319d6c7a89a36f7f6a124b1a686c26f0111a661be7af6a8c27 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.702315 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerStarted","Data":"bb311d589e30cb8df304e92dd98f6340e3b06f32cce70a6cf9a62c55160415fa"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.702675 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerStarted","Data":"b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.707207 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerStarted","Data":"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.707256 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerStarted","Data":"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.710819 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerStarted","Data":"66ef4d4e7ae57b0ef21606dd16dd6adf54e3e40157325be2f254760143a70fbc"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.710885 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerStarted","Data":"1629dc12af9c0f2e01990d353d4f360f13469e2efdb1336029e6ff6a8a1f8e92"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.710959 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76bdc5d745-bxfkf" 
podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon" containerID="cri-o://66ef4d4e7ae57b0ef21606dd16dd6adf54e3e40157325be2f254760143a70fbc" gracePeriod=30 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.710943 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76bdc5d745-bxfkf" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon-log" containerID="cri-o://1629dc12af9c0f2e01990d353d4f360f13469e2efdb1336029e6ff6a8a1f8e92" gracePeriod=30 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.714839 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerStarted","Data":"0f714d9ba4d18c319d6c7a89a36f7f6a124b1a686c26f0111a661be7af6a8c27"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.718065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerStarted","Data":"092f7819a3a6983512c43db948fac849191287700f24b1a0363b14dfa88ce8b3"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.718111 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerStarted","Data":"fae848c877c87a4971d7b95f2f185c27a248656f5d40251c38e2da8af65fe518"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.718193 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6687d64989-wxhgg" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon-log" containerID="cri-o://fae848c877c87a4971d7b95f2f185c27a248656f5d40251c38e2da8af65fe518" gracePeriod=30 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.718209 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6687d64989-wxhgg" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon" containerID="cri-o://092f7819a3a6983512c43db948fac849191287700f24b1a0363b14dfa88ce8b3" gracePeriod=30 Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.725707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerStarted","Data":"65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3"} Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.727859 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c76f5b976-p66q9" podStartSLOduration=2.289585481 podStartE2EDuration="10.727835606s" podCreationTimestamp="2025-10-03 09:04:13 +0000 UTC" firstStartedPulling="2025-10-03 09:04:14.311319306 +0000 UTC m=+7687.738570041" lastFinishedPulling="2025-10-03 09:04:22.749569431 +0000 UTC m=+7696.176820166" observedRunningTime="2025-10-03 09:04:23.719810652 +0000 UTC m=+7697.147061397" watchObservedRunningTime="2025-10-03 09:04:23.727835606 +0000 UTC m=+7697.155086341" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.764306 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76bdc5d745-bxfkf" podStartSLOduration=2.6038908960000002 podStartE2EDuration="12.764280859s" podCreationTimestamp="2025-10-03 09:04:11 +0000 UTC" firstStartedPulling="2025-10-03 09:04:12.544694929 +0000 UTC m=+7685.971945664" lastFinishedPulling="2025-10-03 09:04:22.705084892 +0000 
UTC m=+7696.132335627" observedRunningTime="2025-10-03 09:04:23.750512412 +0000 UTC m=+7697.177763157" watchObservedRunningTime="2025-10-03 09:04:23.764280859 +0000 UTC m=+7697.191531594" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.773021 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6687d64989-wxhgg" podStartSLOduration=2.6855111579999997 podStartE2EDuration="12.772999193s" podCreationTimestamp="2025-10-03 09:04:11 +0000 UTC" firstStartedPulling="2025-10-03 09:04:12.702555617 +0000 UTC m=+7686.129806352" lastFinishedPulling="2025-10-03 09:04:22.790024102 +0000 UTC m=+7696.217294387" observedRunningTime="2025-10-03 09:04:23.770856115 +0000 UTC m=+7697.198106850" watchObservedRunningTime="2025-10-03 09:04:23.772999193 +0000 UTC m=+7697.200249928" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.776212 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.777022 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.808794 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-95fbdc968-9z87d" podStartSLOduration=2.410474771 podStartE2EDuration="10.808773988s" podCreationTimestamp="2025-10-03 09:04:13 +0000 UTC" firstStartedPulling="2025-10-03 09:04:14.394950971 +0000 UTC m=+7687.822201706" lastFinishedPulling="2025-10-03 09:04:22.793250188 +0000 UTC m=+7696.220500923" observedRunningTime="2025-10-03 09:04:23.800141308 +0000 UTC m=+7697.227392063" watchObservedRunningTime="2025-10-03 09:04:23.808773988 +0000 UTC m=+7697.236024713" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.894783 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:23 crc kubenswrapper[4810]: I1003 09:04:23.897029 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.736729 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerStarted","Data":"787a3821a90f8b366d38035e7db82d3692844c42b7d2bf85811ba83389e1173a"} Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.737113 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerStarted","Data":"8e41db6129ebb440e14b354aafb38602fc26eb206790f69566dcdbdb50d588a8"} Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.738461 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerStarted","Data":"67b54925633fb7897de4f9db6a5d84178f3f191ee9b36f058f8d0c7e2ef38c22"} Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.738508 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerStarted","Data":"e5de03c1850b861978cbefa9ed2154f1aa86748c6ffcdac9186479f812fe22bd"} Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.760223 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=8.760200667 podStartE2EDuration="8.760200667s" podCreationTimestamp="2025-10-03 09:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:04:24.752881471 +0000 UTC m=+7698.180132226" watchObservedRunningTime="2025-10-03 09:04:24.760200667 +0000 UTC m=+7698.187451402" Oct 03 09:04:24 crc kubenswrapper[4810]: I1003 09:04:24.777427 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.777403656 podStartE2EDuration="8.777403656s" podCreationTimestamp="2025-10-03 09:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:04:24.774241702 +0000 UTC m=+7698.201492467" watchObservedRunningTime="2025-10-03 09:04:24.777403656 +0000 UTC m=+7698.204654391" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.031942 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.032328 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.071777 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.075218 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.076301 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.094507 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.119263 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.129183 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.764512 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.765072 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.765103 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:27 crc kubenswrapper[4810]: I1003 09:04:27.765120 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 09:04:31 crc kubenswrapper[4810]: I1003 09:04:31.526332 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:31 crc kubenswrapper[4810]: I1003 09:04:31.619191 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Oct 03 09:04:31 crc kubenswrapper[4810]: I1003 09:04:31.619615 4810 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:04:31 crc kubenswrapper[4810]: I1003 09:04:31.620487 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 09:04:31 crc kubenswrapper[4810]: I1003 09:04:31.980956 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 09:04:32 crc kubenswrapper[4810]: I1003 09:04:32.072211 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:32 crc kubenswrapper[4810]: I1003 09:04:32.089222 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:32 crc kubenswrapper[4810]: I1003 09:04:32.089321 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:32 crc kubenswrapper[4810]: I1003 09:04:32.197039 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:33 crc kubenswrapper[4810]: I1003 09:04:33.780839 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Oct 03 09:04:33 crc kubenswrapper[4810]: I1003 09:04:33.896952 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.117:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8443: connect: connection refused" Oct 03 09:04:43 crc kubenswrapper[4810]: I1003 09:04:43.777632 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Oct 03 09:04:43 crc kubenswrapper[4810]: I1003 09:04:43.895867 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.117:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8443: connect: connection refused" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.097993 4810 generic.go:334] "Generic (PLEG): container finished" podID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerID="66ef4d4e7ae57b0ef21606dd16dd6adf54e3e40157325be2f254760143a70fbc" exitCode=137 Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.099255 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerID="1629dc12af9c0f2e01990d353d4f360f13469e2efdb1336029e6ff6a8a1f8e92" exitCode=137 Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.098093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerDied","Data":"66ef4d4e7ae57b0ef21606dd16dd6adf54e3e40157325be2f254760143a70fbc"} Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.099344 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerDied","Data":"1629dc12af9c0f2e01990d353d4f360f13469e2efdb1336029e6ff6a8a1f8e92"} Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.104913 4810 generic.go:334] "Generic (PLEG): container finished" podID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerID="092f7819a3a6983512c43db948fac849191287700f24b1a0363b14dfa88ce8b3" exitCode=137 Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.104962 4810 generic.go:334] "Generic (PLEG): container finished" podID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerID="fae848c877c87a4971d7b95f2f185c27a248656f5d40251c38e2da8af65fe518" exitCode=137 Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.104962 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerDied","Data":"092f7819a3a6983512c43db948fac849191287700f24b1a0363b14dfa88ce8b3"} Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.105049 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerDied","Data":"fae848c877c87a4971d7b95f2f185c27a248656f5d40251c38e2da8af65fe518"} Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.238932 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.291563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhtlr\" (UniqueName: \"kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr\") pod \"394bf15b-cd2f-4486-b716-bd07838fbd94\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.291657 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs\") pod \"394bf15b-cd2f-4486-b716-bd07838fbd94\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.291719 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts\") pod \"394bf15b-cd2f-4486-b716-bd07838fbd94\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.291754 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data\") pod \"394bf15b-cd2f-4486-b716-bd07838fbd94\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.291963 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key\") pod \"394bf15b-cd2f-4486-b716-bd07838fbd94\" (UID: \"394bf15b-cd2f-4486-b716-bd07838fbd94\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.294289 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs" (OuterVolumeSpecName: "logs") pod "394bf15b-cd2f-4486-b716-bd07838fbd94" (UID: "394bf15b-cd2f-4486-b716-bd07838fbd94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.299584 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr" (OuterVolumeSpecName: "kube-api-access-xhtlr") pod "394bf15b-cd2f-4486-b716-bd07838fbd94" (UID: "394bf15b-cd2f-4486-b716-bd07838fbd94"). InnerVolumeSpecName "kube-api-access-xhtlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.310523 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "394bf15b-cd2f-4486-b716-bd07838fbd94" (UID: "394bf15b-cd2f-4486-b716-bd07838fbd94"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.333306 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts" (OuterVolumeSpecName: "scripts") pod "394bf15b-cd2f-4486-b716-bd07838fbd94" (UID: "394bf15b-cd2f-4486-b716-bd07838fbd94"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.333046 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data" (OuterVolumeSpecName: "config-data") pod "394bf15b-cd2f-4486-b716-bd07838fbd94" (UID: "394bf15b-cd2f-4486-b716-bd07838fbd94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.396061 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/394bf15b-cd2f-4486-b716-bd07838fbd94-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.396093 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhtlr\" (UniqueName: \"kubernetes.io/projected/394bf15b-cd2f-4486-b716-bd07838fbd94-kube-api-access-xhtlr\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.396104 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394bf15b-cd2f-4486-b716-bd07838fbd94-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.396114 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.396125 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/394bf15b-cd2f-4486-b716-bd07838fbd94-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.522446 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.599682 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key\") pod \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.599741 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jvhk\" (UniqueName: \"kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk\") pod \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.599820 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts\") pod \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.599872 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs\") pod \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.599994 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data\") pod \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\" (UID: \"18a646bf-18a8-4cab-bc2e-9827425cc2ea\") " Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.600847 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs" (OuterVolumeSpecName: "logs") pod "18a646bf-18a8-4cab-bc2e-9827425cc2ea" (UID: "18a646bf-18a8-4cab-bc2e-9827425cc2ea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.603204 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk" (OuterVolumeSpecName: "kube-api-access-2jvhk") pod "18a646bf-18a8-4cab-bc2e-9827425cc2ea" (UID: "18a646bf-18a8-4cab-bc2e-9827425cc2ea"). InnerVolumeSpecName "kube-api-access-2jvhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.604614 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "18a646bf-18a8-4cab-bc2e-9827425cc2ea" (UID: "18a646bf-18a8-4cab-bc2e-9827425cc2ea"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.625548 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts" (OuterVolumeSpecName: "scripts") pod "18a646bf-18a8-4cab-bc2e-9827425cc2ea" (UID: "18a646bf-18a8-4cab-bc2e-9827425cc2ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.631314 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data" (OuterVolumeSpecName: "config-data") pod "18a646bf-18a8-4cab-bc2e-9827425cc2ea" (UID: "18a646bf-18a8-4cab-bc2e-9827425cc2ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.701968 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.702001 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/18a646bf-18a8-4cab-bc2e-9827425cc2ea-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.702013 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jvhk\" (UniqueName: \"kubernetes.io/projected/18a646bf-18a8-4cab-bc2e-9827425cc2ea-kube-api-access-2jvhk\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.702022 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18a646bf-18a8-4cab-bc2e-9827425cc2ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:54 crc kubenswrapper[4810]: I1003 09:04:54.702029 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18a646bf-18a8-4cab-bc2e-9827425cc2ea-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.117430 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76bdc5d745-bxfkf" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.117427 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76bdc5d745-bxfkf" event={"ID":"18a646bf-18a8-4cab-bc2e-9827425cc2ea","Type":"ContainerDied","Data":"d751683e370625d3f5d5689997632817bb3d46399fc4e91e473172a4f5c392ea"} Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.117841 4810 scope.go:117] "RemoveContainer" containerID="66ef4d4e7ae57b0ef21606dd16dd6adf54e3e40157325be2f254760143a70fbc" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.121006 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6687d64989-wxhgg" event={"ID":"394bf15b-cd2f-4486-b716-bd07838fbd94","Type":"ContainerDied","Data":"8359bafd7d36f65f1b80b2570e99a9df73c465500cc1e46cbf47225741900e10"} Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.121249 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6687d64989-wxhgg" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.153144 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.168618 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76bdc5d745-bxfkf"] Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.181521 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.191160 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6687d64989-wxhgg"] Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.303814 4810 scope.go:117] "RemoveContainer" containerID="1629dc12af9c0f2e01990d353d4f360f13469e2efdb1336029e6ff6a8a1f8e92" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.315249 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" path="/var/lib/kubelet/pods/18a646bf-18a8-4cab-bc2e-9827425cc2ea/volumes" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.317947 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" path="/var/lib/kubelet/pods/394bf15b-cd2f-4486-b716-bd07838fbd94/volumes" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.324752 4810 scope.go:117] "RemoveContainer" containerID="092f7819a3a6983512c43db948fac849191287700f24b1a0363b14dfa88ce8b3" Oct 03 09:04:55 crc kubenswrapper[4810]: I1003 09:04:55.496170 4810 scope.go:117] "RemoveContainer" containerID="fae848c877c87a4971d7b95f2f185c27a248656f5d40251c38e2da8af65fe518" Oct 03 09:04:56 crc kubenswrapper[4810]: I1003 09:04:56.337645 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:56 crc kubenswrapper[4810]: I1003 09:04:56.346915 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:58 crc kubenswrapper[4810]: I1003 09:04:58.127042 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:04:58 crc kubenswrapper[4810]: I1003 09:04:58.191845 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:04:58 crc kubenswrapper[4810]: I1003 09:04:58.216181 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:04:59 crc kubenswrapper[4810]: I1003 09:04:59.164964 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon-log" containerID="cri-o://b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5" gracePeriod=30 Oct 03 09:04:59 crc kubenswrapper[4810]: I1003 09:04:59.165044 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" containerID="cri-o://bb311d589e30cb8df304e92dd98f6340e3b06f32cce70a6cf9a62c55160415fa" gracePeriod=30 Oct 03 09:05:02 crc kubenswrapper[4810]: I1003 09:05:02.047171 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-w8s7r"] Oct 03 09:05:02 crc kubenswrapper[4810]: I1003 09:05:02.059957 4810 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-w8s7r"] Oct 03 09:05:02 crc kubenswrapper[4810]: I1003 09:05:02.088816 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:05:02 crc kubenswrapper[4810]: I1003 09:05:02.088941 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:05:03 crc kubenswrapper[4810]: I1003 09:05:03.211795 4810 generic.go:334] "Generic (PLEG): container finished" podID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerID="bb311d589e30cb8df304e92dd98f6340e3b06f32cce70a6cf9a62c55160415fa" exitCode=0 Oct 03 09:05:03 crc kubenswrapper[4810]: I1003 09:05:03.213032 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerDied","Data":"bb311d589e30cb8df304e92dd98f6340e3b06f32cce70a6cf9a62c55160415fa"} Oct 03 09:05:03 crc kubenswrapper[4810]: I1003 09:05:03.315850 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="befb7b90-32fb-45a4-80e8-6e2540bd458a" path="/var/lib/kubelet/pods/befb7b90-32fb-45a4-80e8-6e2540bd458a/volumes" Oct 03 09:05:03 crc kubenswrapper[4810]: I1003 09:05:03.777931 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Oct 03 09:05:13 crc kubenswrapper[4810]: I1003 09:05:13.031829 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f221-account-create-c4l9d"] Oct 03 09:05:13 crc kubenswrapper[4810]: I1003 09:05:13.040055 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f221-account-create-c4l9d"] Oct 03 09:05:13 crc kubenswrapper[4810]: I1003 09:05:13.314732 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42c70db-9189-44ad-8658-f6632d99565a" path="/var/lib/kubelet/pods/f42c70db-9189-44ad-8658-f6632d99565a/volumes" Oct 03 09:05:13 crc kubenswrapper[4810]: I1003 09:05:13.777160 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.383308 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:05:20 crc kubenswrapper[4810]: E1003 09:05:20.384538 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.384556 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" 
containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: E1003 09:05:20.384572 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.384580 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: E1003 09:05:20.384595 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.384603 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: E1003 09:05:20.384620 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.384628 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.384875 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.385359 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.385377 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a646bf-18a8-4cab-bc2e-9827425cc2ea" containerName="horizon" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.385389 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="394bf15b-cd2f-4486-b716-bd07838fbd94" containerName="horizon-log" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.386825 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.401684 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.574959 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzft\" (UniqueName: \"kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.575091 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.575134 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.575183 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.575394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.575467 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.576211 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678349 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzft\" (UniqueName: \"kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678402 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678434 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678472 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678489 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678503 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.678611 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.679356 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.680774 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.682024 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.687168 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " 
pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.690806 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.692772 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.702318 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzft\" (UniqueName: \"kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft\") pod \"horizon-6c58b6647b-ttkrv\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:20 crc kubenswrapper[4810]: I1003 09:05:20.712195 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.030240 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fkzrg"] Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.043863 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fkzrg"] Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.196041 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.319763 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d233c1b-8472-419e-9d71-ab0cf11c2605" path="/var/lib/kubelet/pods/8d233c1b-8472-419e-9d71-ab0cf11c2605/volumes" Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.425711 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerStarted","Data":"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409"} Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.426460 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerStarted","Data":"15f1e51810efa2f0f3527e2a0cfc1ae889104a2578b799f81ee99c31f32ccf48"} Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.948672 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-ssk6q"] Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.950373 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:21 crc kubenswrapper[4810]: I1003 09:05:21.963333 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ssk6q"] Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.115163 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h8lf\" (UniqueName: \"kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf\") pod \"heat-db-create-ssk6q\" (UID: \"2a8c1d63-ceef-405e-8e0c-ac022d824c44\") " pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.217413 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h8lf\" (UniqueName: \"kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf\") pod \"heat-db-create-ssk6q\" (UID: \"2a8c1d63-ceef-405e-8e0c-ac022d824c44\") " pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.237479 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h8lf\" (UniqueName: \"kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf\") pod \"heat-db-create-ssk6q\" (UID: \"2a8c1d63-ceef-405e-8e0c-ac022d824c44\") " pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.273460 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.448883 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerStarted","Data":"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d"} Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.476857 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c58b6647b-ttkrv" podStartSLOduration=2.4768248330000002 podStartE2EDuration="2.476824833s" podCreationTimestamp="2025-10-03 09:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:22.468045067 +0000 UTC m=+7755.895295842" watchObservedRunningTime="2025-10-03 09:05:22.476824833 +0000 UTC m=+7755.904075568" Oct 03 09:05:22 crc kubenswrapper[4810]: I1003 09:05:22.815173 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-ssk6q"] Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.004831 4810 scope.go:117] "RemoveContainer" containerID="910c4507038139ef51066fd7772205b3f10067f87baedb27075b1d39019cb2fc" Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.057884 4810 scope.go:117] "RemoveContainer" containerID="316345a114ef9ee66df79c88ba8023061810e416879575b9361e7e39a678b4ab" Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.112630 4810 scope.go:117] "RemoveContainer" containerID="649fb516ffcce0ea8f4cc74f8497a92141ca39c45305d503c5ed5a54b9287c1c" Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.459283 4810 generic.go:334] "Generic (PLEG): container finished" podID="2a8c1d63-ceef-405e-8e0c-ac022d824c44" containerID="5f638f1e284830d337b62e2a6e0e41447ef9f5c27392c52ae056d6b9e2684516" exitCode=0 Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.460752 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ssk6q" 
event={"ID":"2a8c1d63-ceef-405e-8e0c-ac022d824c44","Type":"ContainerDied","Data":"5f638f1e284830d337b62e2a6e0e41447ef9f5c27392c52ae056d6b9e2684516"} Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.460784 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ssk6q" event={"ID":"2a8c1d63-ceef-405e-8e0c-ac022d824c44","Type":"ContainerStarted","Data":"d6888f57d2a3eb00449901a8bd00f075af653b765a9a0a8cf731f08b1ba11453"} Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.777478 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c76f5b976-p66q9" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.116:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8443: connect: connection refused" Oct 03 09:05:23 crc kubenswrapper[4810]: I1003 09:05:23.777886 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:05:24 crc kubenswrapper[4810]: I1003 09:05:24.827946 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:24 crc kubenswrapper[4810]: I1003 09:05:24.918156 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h8lf\" (UniqueName: \"kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf\") pod \"2a8c1d63-ceef-405e-8e0c-ac022d824c44\" (UID: \"2a8c1d63-ceef-405e-8e0c-ac022d824c44\") " Oct 03 09:05:24 crc kubenswrapper[4810]: I1003 09:05:24.923942 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf" (OuterVolumeSpecName: "kube-api-access-6h8lf") pod "2a8c1d63-ceef-405e-8e0c-ac022d824c44" (UID: "2a8c1d63-ceef-405e-8e0c-ac022d824c44"). InnerVolumeSpecName "kube-api-access-6h8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:25 crc kubenswrapper[4810]: I1003 09:05:25.019716 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h8lf\" (UniqueName: \"kubernetes.io/projected/2a8c1d63-ceef-405e-8e0c-ac022d824c44-kube-api-access-6h8lf\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:25 crc kubenswrapper[4810]: I1003 09:05:25.480107 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-ssk6q" event={"ID":"2a8c1d63-ceef-405e-8e0c-ac022d824c44","Type":"ContainerDied","Data":"d6888f57d2a3eb00449901a8bd00f075af653b765a9a0a8cf731f08b1ba11453"} Oct 03 09:05:25 crc kubenswrapper[4810]: I1003 09:05:25.480163 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6888f57d2a3eb00449901a8bd00f075af653b765a9a0a8cf731f08b1ba11453" Oct 03 09:05:25 crc kubenswrapper[4810]: I1003 09:05:25.480234 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-ssk6q" Oct 03 09:05:29 crc kubenswrapper[4810]: E1003 09:05:29.460645 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8c1d63_ceef_405e_8e0c_ac022d824c44.slice/crio-5f638f1e284830d337b62e2a6e0e41447ef9f5c27392c52ae056d6b9e2684516.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8c1d63_ceef_405e_8e0c_ac022d824c44.slice/crio-d6888f57d2a3eb00449901a8bd00f075af653b765a9a0a8cf731f08b1ba11453\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f271d78_0084_43b6_9e2b_f8f19b515f70.slice/crio-conmon-b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f271d78_0084_43b6_9e2b_f8f19b515f70.slice/crio-b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8c1d63_ceef_405e_8e0c_ac022d824c44.slice/crio-conmon-5f638f1e284830d337b62e2a6e0e41447ef9f5c27392c52ae056d6b9e2684516.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8c1d63_ceef_405e_8e0c_ac022d824c44.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.516700 4810 generic.go:334] "Generic (PLEG): container finished" podID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerID="b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5" exitCode=137 Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.517054 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerDied","Data":"b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5"} Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.670125 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.814017 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815116 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815158 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6f9z\" (UniqueName: \"kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815231 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815276 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815361 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.815391 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data\") pod \"6f271d78-0084-43b6-9e2b-f8f19b515f70\" (UID: \"6f271d78-0084-43b6-9e2b-f8f19b515f70\") " Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.816352 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs" (OuterVolumeSpecName: "logs") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.832185 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z" (OuterVolumeSpecName: "kube-api-access-c6f9z") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "kube-api-access-c6f9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.832314 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.843722 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts" (OuterVolumeSpecName: "scripts") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.848801 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.865176 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data" (OuterVolumeSpecName: "config-data") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.874761 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "6f271d78-0084-43b6-9e2b-f8f19b515f70" (UID: "6f271d78-0084-43b6-9e2b-f8f19b515f70"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918179 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918211 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918221 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f271d78-0084-43b6-9e2b-f8f19b515f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918231 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918241 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f271d78-0084-43b6-9e2b-f8f19b515f70-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918250 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f271d78-0084-43b6-9e2b-f8f19b515f70-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:29 crc kubenswrapper[4810]: I1003 09:05:29.918260 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6f9z\" (UniqueName: \"kubernetes.io/projected/6f271d78-0084-43b6-9e2b-f8f19b515f70-kube-api-access-c6f9z\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.528157 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c76f5b976-p66q9" event={"ID":"6f271d78-0084-43b6-9e2b-f8f19b515f70","Type":"ContainerDied","Data":"0e9351d15111606ff8d6a8c9ea0944363add4ccc353058884b403a7f8badf32d"} Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.528539 4810 scope.go:117] "RemoveContainer" containerID="bb311d589e30cb8df304e92dd98f6340e3b06f32cce70a6cf9a62c55160415fa" Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.528291 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c76f5b976-p66q9" Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.568047 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.595610 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c76f5b976-p66q9"] Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.713940 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.714033 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:30 crc kubenswrapper[4810]: I1003 09:05:30.722529 4810 scope.go:117] "RemoveContainer" containerID="b00c63e2cc01a1a38bd5110b515d1da4c3ef20b48a59b89ee7184802b32c3cd5" Oct 03 09:05:31 crc kubenswrapper[4810]: I1003 09:05:31.326174 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" path="/var/lib/kubelet/pods/6f271d78-0084-43b6-9e2b-f8f19b515f70/volumes" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.020857 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-d134-account-create-xcc2k"] Oct 03 09:05:32 crc kubenswrapper[4810]: E1003 09:05:32.021767 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon-log" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.021788 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon-log" Oct 03 09:05:32 crc kubenswrapper[4810]: E1003 09:05:32.021815 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8c1d63-ceef-405e-8e0c-ac022d824c44" containerName="mariadb-database-create" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.021826 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c1d63-ceef-405e-8e0c-ac022d824c44" containerName="mariadb-database-create" Oct 03 09:05:32 crc kubenswrapper[4810]: E1003 09:05:32.021864 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.021872 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.022200 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8c1d63-ceef-405e-8e0c-ac022d824c44" containerName="mariadb-database-create" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.022236 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.022270 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f271d78-0084-43b6-9e2b-f8f19b515f70" containerName="horizon-log" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.023262 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.026304 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.033706 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d134-account-create-xcc2k"] Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.089156 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.089220 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.089288 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.090335 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.090438 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" gracePeriod=600 Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.174798 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m65dk\" (UniqueName: \"kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk\") pod \"heat-d134-account-create-xcc2k\" (UID: \"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d\") " pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:32 crc kubenswrapper[4810]: E1003 09:05:32.237600 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.277458 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m65dk\" (UniqueName: \"kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk\") pod \"heat-d134-account-create-xcc2k\" (UID: \"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d\") " pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 
09:05:32.300660 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m65dk\" (UniqueName: \"kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk\") pod \"heat-d134-account-create-xcc2k\" (UID: \"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d\") " pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.355212 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.555325 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" exitCode=0 Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.555383 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da"} Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.555428 4810 scope.go:117] "RemoveContainer" containerID="36cfab4838df95df84fbbb0b6e408e16388ef60506d996263f6231941a6c6973" Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.556207 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:05:32 crc kubenswrapper[4810]: E1003 09:05:32.556589 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:05:32 crc kubenswrapper[4810]: W1003 09:05:32.806090 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e300415_0c77_4c7d_a35b_5bc2b5e6a01d.slice/crio-567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8 WatchSource:0}: Error finding container 567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8: Status 404 returned error can't find the container with id 567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8 Oct 03 09:05:32 crc kubenswrapper[4810]: I1003 09:05:32.806708 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-d134-account-create-xcc2k"] Oct 03 09:05:33 crc kubenswrapper[4810]: I1003 09:05:33.568867 4810 generic.go:334] "Generic (PLEG): container finished" podID="5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" containerID="d5561a2dc144a26ef50427c1350893c3889c01df11510ef51f59866fb9c09a65" exitCode=0 Oct 03 09:05:33 crc kubenswrapper[4810]: I1003 09:05:33.568951 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d134-account-create-xcc2k" event={"ID":"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d","Type":"ContainerDied","Data":"d5561a2dc144a26ef50427c1350893c3889c01df11510ef51f59866fb9c09a65"} Oct 03 09:05:33 crc kubenswrapper[4810]: I1003 09:05:33.569218 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d134-account-create-xcc2k" 
event={"ID":"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d","Type":"ContainerStarted","Data":"567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8"} Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.013139 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.155391 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m65dk\" (UniqueName: \"kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk\") pod \"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d\" (UID: \"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d\") " Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.164576 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk" (OuterVolumeSpecName: "kube-api-access-m65dk") pod "5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" (UID: "5e300415-0c77-4c7d-a35b-5bc2b5e6a01d"). InnerVolumeSpecName "kube-api-access-m65dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.259082 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m65dk\" (UniqueName: \"kubernetes.io/projected/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d-kube-api-access-m65dk\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.595410 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-d134-account-create-xcc2k" event={"ID":"5e300415-0c77-4c7d-a35b-5bc2b5e6a01d","Type":"ContainerDied","Data":"567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8"} Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.595473 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="567154b9109b0c2deed65cb110aa4221aabff85e54d7ad5749c8afe4d1c6e1d8" Oct 03 09:05:35 crc kubenswrapper[4810]: I1003 09:05:35.595537 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-d134-account-create-xcc2k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.065971 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-7474k"] Oct 03 09:05:37 crc kubenswrapper[4810]: E1003 09:05:37.066712 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" containerName="mariadb-account-create" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.066727 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" containerName="mariadb-account-create" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.066940 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" containerName="mariadb-account-create" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.067675 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.070332 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9rx4l" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.070612 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.077554 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-7474k"] Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.202803 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.202993 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.203064 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsnmf\" (UniqueName: \"kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.304642 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.305544 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.305636 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsnmf\" (UniqueName: \"kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.313970 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.315765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" 
Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.327644 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsnmf\" (UniqueName: \"kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf\") pod \"heat-db-sync-7474k\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.399598 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-7474k" Oct 03 09:05:37 crc kubenswrapper[4810]: I1003 09:05:37.935494 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-7474k"] Oct 03 09:05:38 crc kubenswrapper[4810]: I1003 09:05:38.631935 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-7474k" event={"ID":"9b28102f-d4e0-4348-8ab4-98be581908e2","Type":"ContainerStarted","Data":"42588844e8e2727c09d8aab527e6bffc92a107358a149538559e0e37bdb1deac"} Oct 03 09:05:40 crc kubenswrapper[4810]: I1003 09:05:40.719595 4810 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Oct 03 09:05:45 crc kubenswrapper[4810]: I1003 09:05:45.303269 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:05:45 crc kubenswrapper[4810]: E1003 09:05:45.304413 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:05:50 crc kubenswrapper[4810]: I1003 09:05:50.758560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-7474k" event={"ID":"9b28102f-d4e0-4348-8ab4-98be581908e2","Type":"ContainerStarted","Data":"3131e01fff21547ea79afbfa86fce6c8d593859c7204fdc10be84801049c36f2"} Oct 03 09:05:50 crc kubenswrapper[4810]: I1003 09:05:50.776319 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-7474k" podStartSLOduration=2.23492388 podStartE2EDuration="13.776289396s" podCreationTimestamp="2025-10-03 09:05:37 +0000 UTC" firstStartedPulling="2025-10-03 09:05:37.946987681 +0000 UTC m=+7771.374238416" lastFinishedPulling="2025-10-03 09:05:49.488353197 +0000 UTC m=+7782.915603932" observedRunningTime="2025-10-03 09:05:50.773254344 +0000 UTC m=+7784.200505089" watchObservedRunningTime="2025-10-03 09:05:50.776289396 +0000 UTC m=+7784.203540141" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.627087 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.629809 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.643314 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.740846 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.740976 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.741000 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnflp\" (UniqueName: \"kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.843587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.843675 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.843709 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnflp\" (UniqueName: \"kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.844693 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.845033 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.863932 4810 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vnflp\" (UniqueName: \"kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp\") pod \"redhat-marketplace-h2bml\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:51 crc kubenswrapper[4810]: I1003 09:05:51.951221 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:05:52 crc kubenswrapper[4810]: W1003 09:05:52.424623 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c4fc0ab_8a40_4a51_85a3_ad5f496701eb.slice/crio-4c536b2e4b9dc0da84e0a4a4cf974ab9686040d67e6050d9f28479e06d55398a WatchSource:0}: Error finding container 4c536b2e4b9dc0da84e0a4a4cf974ab9686040d67e6050d9f28479e06d55398a: Status 404 returned error can't find the container with id 4c536b2e4b9dc0da84e0a4a4cf974ab9686040d67e6050d9f28479e06d55398a Oct 03 09:05:52 crc kubenswrapper[4810]: I1003 09:05:52.430440 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:05:52 crc kubenswrapper[4810]: I1003 09:05:52.788536 4810 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerID="4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226" exitCode=0 Oct 03 09:05:52 crc kubenswrapper[4810]: I1003 09:05:52.788788 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerDied","Data":"4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226"} Oct 03 09:05:52 crc kubenswrapper[4810]: I1003 09:05:52.789294 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerStarted","Data":"4c536b2e4b9dc0da84e0a4a4cf974ab9686040d67e6050d9f28479e06d55398a"} Oct 03 09:05:52 crc kubenswrapper[4810]: I1003 09:05:52.822670 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:53 crc kubenswrapper[4810]: I1003 09:05:53.831231 4810 generic.go:334] "Generic (PLEG): container finished" podID="9b28102f-d4e0-4348-8ab4-98be581908e2" containerID="3131e01fff21547ea79afbfa86fce6c8d593859c7204fdc10be84801049c36f2" exitCode=0 Oct 03 09:05:53 crc kubenswrapper[4810]: I1003 09:05:53.831245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-7474k" event={"ID":"9b28102f-d4e0-4348-8ab4-98be581908e2","Type":"ContainerDied","Data":"3131e01fff21547ea79afbfa86fce6c8d593859c7204fdc10be84801049c36f2"} Oct 03 09:05:53 crc kubenswrapper[4810]: I1003 09:05:53.834813 4810 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerID="56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad" exitCode=0 Oct 03 09:05:53 crc kubenswrapper[4810]: I1003 09:05:53.834875 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerDied","Data":"56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad"} Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.638926 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.724774 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.725077 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon-log" containerID="cri-o://2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.725334 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" containerID="cri-o://a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c" gracePeriod=30 Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.852850 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerStarted","Data":"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728"} Oct 03 09:05:54 crc kubenswrapper[4810]: I1003 09:05:54.876710 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h2bml" podStartSLOduration=2.3745488 podStartE2EDuration="3.876690751s" podCreationTimestamp="2025-10-03 09:05:51 +0000 UTC" firstStartedPulling="2025-10-03 09:05:52.794427002 +0000 UTC m=+7786.221677757" lastFinishedPulling="2025-10-03 09:05:54.296568973 +0000 UTC m=+7787.723819708" observedRunningTime="2025-10-03 09:05:54.871078171 +0000 UTC m=+7788.298328906" watchObservedRunningTime="2025-10-03 09:05:54.876690751 +0000 UTC m=+7788.303941486" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.238031 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-7474k" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.339829 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data\") pod \"9b28102f-d4e0-4348-8ab4-98be581908e2\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.340527 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsnmf\" (UniqueName: \"kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf\") pod \"9b28102f-d4e0-4348-8ab4-98be581908e2\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.340577 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle\") pod \"9b28102f-d4e0-4348-8ab4-98be581908e2\" (UID: \"9b28102f-d4e0-4348-8ab4-98be581908e2\") " Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.363248 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf" (OuterVolumeSpecName: "kube-api-access-xsnmf") pod "9b28102f-d4e0-4348-8ab4-98be581908e2" (UID: "9b28102f-d4e0-4348-8ab4-98be581908e2"). InnerVolumeSpecName "kube-api-access-xsnmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.374959 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b28102f-d4e0-4348-8ab4-98be581908e2" (UID: "9b28102f-d4e0-4348-8ab4-98be581908e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.431525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data" (OuterVolumeSpecName: "config-data") pod "9b28102f-d4e0-4348-8ab4-98be581908e2" (UID: "9b28102f-d4e0-4348-8ab4-98be581908e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.443454 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsnmf\" (UniqueName: \"kubernetes.io/projected/9b28102f-d4e0-4348-8ab4-98be581908e2-kube-api-access-xsnmf\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.443490 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.443502 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b28102f-d4e0-4348-8ab4-98be581908e2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.862196 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-7474k" Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.862187 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-7474k" event={"ID":"9b28102f-d4e0-4348-8ab4-98be581908e2","Type":"ContainerDied","Data":"42588844e8e2727c09d8aab527e6bffc92a107358a149538559e0e37bdb1deac"} Oct 03 09:05:55 crc kubenswrapper[4810]: I1003 09:05:55.862374 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42588844e8e2727c09d8aab527e6bffc92a107358a149538559e0e37bdb1deac" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.894457 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:05:56 crc kubenswrapper[4810]: E1003 09:05:56.895002 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b28102f-d4e0-4348-8ab4-98be581908e2" containerName="heat-db-sync" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.895018 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b28102f-d4e0-4348-8ab4-98be581908e2" containerName="heat-db-sync" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.895345 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b28102f-d4e0-4348-8ab4-98be581908e2" containerName="heat-db-sync" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.896270 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.899081 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.899442 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9rx4l" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.902244 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 03 09:05:56 crc kubenswrapper[4810]: I1003 09:05:56.905112 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.079545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.080018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.080306 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.080358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67qkg\" (UniqueName: \"kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.127802 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.129420 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.134625 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.144855 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.146575 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.149850 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.171289 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.186325 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.186403 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.186567 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.186595 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67qkg\" (UniqueName: \"kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.197511 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.198043 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.203289 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data\") pod \"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.221971 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.222869 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67qkg\" (UniqueName: \"kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg\") pod 
\"heat-engine-767579c56f-rmc2j\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.232415 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290186 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290280 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290311 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgs8c\" (UniqueName: \"kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290379 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290401 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290420 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdggz\" (UniqueName: \"kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.290519 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " 
pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416203 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416289 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416328 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416520 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdggz\" (UniqueName: \"kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416597 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416665 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416783 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.416836 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgs8c\" (UniqueName: \"kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.434360 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc 
kubenswrapper[4810]: I1003 09:05:57.437384 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.442970 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.443596 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.444224 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.460915 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.475736 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgs8c\" (UniqueName: \"kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c\") pod \"heat-api-79b657474f-9khfh\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.486735 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdggz\" (UniqueName: \"kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz\") pod \"heat-cfnapi-c4766b898-w7pkh\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.751789 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.769732 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.783063 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:05:57 crc kubenswrapper[4810]: W1003 09:05:57.792073 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1977444_b7fb_4943_8079_d7c97d32a5ca.slice/crio-38646f2ee41a0cf083362fc0ea524b7b0eb5cacc6a87ffa240fab81f4a8a1feb WatchSource:0}: Error finding container 38646f2ee41a0cf083362fc0ea524b7b0eb5cacc6a87ffa240fab81f4a8a1feb: Status 404 returned error can't find the container with id 38646f2ee41a0cf083362fc0ea524b7b0eb5cacc6a87ffa240fab81f4a8a1feb Oct 03 09:05:57 crc kubenswrapper[4810]: I1003 09:05:57.897028 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-767579c56f-rmc2j" event={"ID":"e1977444-b7fb-4943-8079-d7c97d32a5ca","Type":"ContainerStarted","Data":"38646f2ee41a0cf083362fc0ea524b7b0eb5cacc6a87ffa240fab81f4a8a1feb"} Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.319211 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.328160 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:05:58 crc kubenswrapper[4810]: W1003 09:05:58.330085 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda43dd7fd_90e2_4dfe_83de_1094e1b24a93.slice/crio-578ed9fd80915ae8310641f10cc49d1ad485ed9cd536d93f4c6e7a89d2222863 WatchSource:0}: Error finding container 578ed9fd80915ae8310641f10cc49d1ad485ed9cd536d93f4c6e7a89d2222863: Status 404 returned error can't find the container with id 578ed9fd80915ae8310641f10cc49d1ad485ed9cd536d93f4c6e7a89d2222863 Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.907341 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c4766b898-w7pkh" event={"ID":"18e853cb-f172-4ea6-b59b-cf75610b3eba","Type":"ContainerStarted","Data":"3682af873143dd810ad8d1b860d6859ee508b75adf7710c6cb159f475dc75111"} Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.910516 4810 generic.go:334] "Generic (PLEG): container finished" podID="7ac00941-8b00-47af-9183-b7578731f8eb" containerID="a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c" exitCode=0 Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.910591 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerDied","Data":"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c"} Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.912429 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-767579c56f-rmc2j" event={"ID":"e1977444-b7fb-4943-8079-d7c97d32a5ca","Type":"ContainerStarted","Data":"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec"} Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.912568 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.916197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b657474f-9khfh" 
event={"ID":"a43dd7fd-90e2-4dfe-83de-1094e1b24a93","Type":"ContainerStarted","Data":"578ed9fd80915ae8310641f10cc49d1ad485ed9cd536d93f4c6e7a89d2222863"} Oct 03 09:05:58 crc kubenswrapper[4810]: I1003 09:05:58.938158 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-767579c56f-rmc2j" podStartSLOduration=2.938136886 podStartE2EDuration="2.938136886s" podCreationTimestamp="2025-10-03 09:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:05:58.929596398 +0000 UTC m=+7792.356847133" watchObservedRunningTime="2025-10-03 09:05:58.938136886 +0000 UTC m=+7792.365387631" Oct 03 09:06:00 crc kubenswrapper[4810]: I1003 09:06:00.311110 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:06:00 crc kubenswrapper[4810]: E1003 09:06:00.312113 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:06:00 crc kubenswrapper[4810]: I1003 09:06:00.975733 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b657474f-9khfh" event={"ID":"a43dd7fd-90e2-4dfe-83de-1094e1b24a93","Type":"ContainerStarted","Data":"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de"} Oct 03 09:06:00 crc kubenswrapper[4810]: I1003 09:06:00.975994 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:06:00 crc kubenswrapper[4810]: I1003 09:06:00.987918 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c4766b898-w7pkh" event={"ID":"18e853cb-f172-4ea6-b59b-cf75610b3eba","Type":"ContainerStarted","Data":"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5"} Oct 03 09:06:00 crc kubenswrapper[4810]: I1003 09:06:00.988472 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:06:01 crc kubenswrapper[4810]: I1003 09:06:01.002418 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-79b657474f-9khfh" podStartSLOduration=2.242063736 podStartE2EDuration="4.002395104s" podCreationTimestamp="2025-10-03 09:05:57 +0000 UTC" firstStartedPulling="2025-10-03 09:05:58.333686247 +0000 UTC m=+7791.760936982" lastFinishedPulling="2025-10-03 09:06:00.094017615 +0000 UTC m=+7793.521268350" observedRunningTime="2025-10-03 09:06:00.997601506 +0000 UTC m=+7794.424852241" watchObservedRunningTime="2025-10-03 09:06:01.002395104 +0000 UTC m=+7794.429645839" Oct 03 09:06:01 crc kubenswrapper[4810]: I1003 09:06:01.037286 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-c4766b898-w7pkh" podStartSLOduration=2.298412321 podStartE2EDuration="4.037266505s" podCreationTimestamp="2025-10-03 09:05:57 +0000 UTC" firstStartedPulling="2025-10-03 09:05:58.352632774 +0000 UTC m=+7791.779883509" lastFinishedPulling="2025-10-03 09:06:00.091486958 +0000 UTC m=+7793.518737693" observedRunningTime="2025-10-03 09:06:01.021483304 +0000 UTC m=+7794.448734039" 
watchObservedRunningTime="2025-10-03 09:06:01.037266505 +0000 UTC m=+7794.464517240" Oct 03 09:06:01 crc kubenswrapper[4810]: I1003 09:06:01.951652 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:01 crc kubenswrapper[4810]: I1003 09:06:01.951929 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:02 crc kubenswrapper[4810]: I1003 09:06:02.034633 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:02 crc kubenswrapper[4810]: I1003 09:06:02.082057 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:02 crc kubenswrapper[4810]: I1003 09:06:02.279114 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:06:03 crc kubenswrapper[4810]: I1003 09:06:03.897483 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.117:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8443: connect: connection refused" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.023186 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h2bml" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="registry-server" containerID="cri-o://8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728" gracePeriod=2 Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.265583 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.267850 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.306501 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.372330 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc6wr\" (UniqueName: \"kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.372423 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.372545 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.372581 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.372820 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.375629 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.400948 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.402811 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.433541 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.456270 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474280 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474491 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474567 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474640 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474775 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474884 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.474994 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.475061 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") 
" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.475200 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc6wr\" (UniqueName: \"kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.475321 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qv4z\" (UniqueName: \"kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.475425 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.475518 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2kg\" (UniqueName: \"kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.485554 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.486457 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.498799 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc6wr\" (UniqueName: \"kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.505646 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom\") pod \"heat-engine-55547566f9-2tzph\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577252 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qv4z\" (UniqueName: \"kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: 
\"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577377 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2kg\" (UniqueName: \"kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577540 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577567 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577669 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.577974 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.581744 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.582937 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " 
pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.584386 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.585869 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.585884 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.587764 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.598783 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qv4z\" (UniqueName: \"kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z\") pod \"heat-cfnapi-5749954ff8-49msr\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.601126 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2kg\" (UniqueName: \"kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg\") pod \"heat-api-767dcd9844-bdcsr\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.620339 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.654193 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.711458 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.731450 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.732089 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:04 crc kubenswrapper[4810]: E1003 09:06:04.732608 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="extract-content" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.732624 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="extract-content" Oct 03 09:06:04 crc kubenswrapper[4810]: E1003 09:06:04.732661 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="extract-utilities" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.732670 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="extract-utilities" Oct 03 09:06:04 crc kubenswrapper[4810]: E1003 09:06:04.732685 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="registry-server" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.732691 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="registry-server" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.732965 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerName="registry-server" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.734716 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.743764 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.786672 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnflp\" (UniqueName: \"kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp\") pod \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.787168 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content\") pod \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.787243 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities\") pod \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\" (UID: \"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb\") " Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.791796 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities" (OuterVolumeSpecName: "utilities") pod "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" (UID: "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.806850 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp" (OuterVolumeSpecName: "kube-api-access-vnflp") pod "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" (UID: "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb"). InnerVolumeSpecName "kube-api-access-vnflp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.808425 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" (UID: "8c4fc0ab-8a40-4a51-85a3-ad5f496701eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889685 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889802 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889840 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27p9q\" (UniqueName: \"kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889910 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889926 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.889935 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnflp\" (UniqueName: \"kubernetes.io/projected/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb-kube-api-access-vnflp\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.992170 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.992301 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.992339 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27p9q\" (UniqueName: \"kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.992655 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:04 crc kubenswrapper[4810]: I1003 09:06:04.992720 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.014578 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27p9q\" (UniqueName: \"kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q\") pod \"community-operators-rzp6l\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.039182 4810 generic.go:334] "Generic (PLEG): container finished" podID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" containerID="8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728" exitCode=0 Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.039236 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerDied","Data":"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728"} Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.039293 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2bml" event={"ID":"8c4fc0ab-8a40-4a51-85a3-ad5f496701eb","Type":"ContainerDied","Data":"4c536b2e4b9dc0da84e0a4a4cf974ab9686040d67e6050d9f28479e06d55398a"} Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.039317 4810 scope.go:117] "RemoveContainer" containerID="8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.039556 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2bml" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.055203 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.090608 4810 scope.go:117] "RemoveContainer" containerID="56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.101084 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.109722 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2bml"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.181337 4810 scope.go:117] "RemoveContainer" containerID="4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.244352 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:06:05 crc kubenswrapper[4810]: W1003 09:06:05.355116 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4474daf6_43cd_4602_895c_c4ca53b7b35d.slice/crio-19e6acf552cc80d7718ee7888a9c389b0b7b04ed92e8baf33f267c22be53d095 WatchSource:0}: Error finding container 19e6acf552cc80d7718ee7888a9c389b0b7b04ed92e8baf33f267c22be53d095: Status 404 returned error can't find the container with id 19e6acf552cc80d7718ee7888a9c389b0b7b04ed92e8baf33f267c22be53d095 Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.359375 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4fc0ab-8a40-4a51-85a3-ad5f496701eb" path="/var/lib/kubelet/pods/8c4fc0ab-8a40-4a51-85a3-ad5f496701eb/volumes" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.368966 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.409268 4810 scope.go:117] "RemoveContainer" containerID="8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728" Oct 03 09:06:05 crc kubenswrapper[4810]: E1003 09:06:05.410223 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728\": container with ID starting with 8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728 not found: ID does not exist" containerID="8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.410261 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728"} err="failed to get container status \"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728\": rpc error: code = NotFound desc = could not find container \"8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728\": container with ID starting with 8453f2457ecc88520bb6a5b69da1494fe8763e52b28bc13895d509edfe768728 not found: ID does not exist" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.410289 4810 scope.go:117] "RemoveContainer" containerID="56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad" Oct 03 09:06:05 crc kubenswrapper[4810]: E1003 09:06:05.410646 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad\": container with ID starting with 56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad not found: ID does not exist" containerID="56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.410674 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad"} err="failed to get container status \"56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad\": rpc error: code = NotFound desc = could not find container \"56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad\": container with ID starting with 56ac7c2ba56884aef749a41f5bfbd4734d6cc009cb01b9aaee0f7d78b7b01aad not found: ID does not exist" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.410697 4810 scope.go:117] "RemoveContainer" containerID="4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226" Oct 03 09:06:05 crc kubenswrapper[4810]: E1003 09:06:05.411413 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226\": container with ID starting with 4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226 not found: ID does not exist" containerID="4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.411442 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226"} err="failed to get container status \"4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226\": rpc error: code = NotFound desc = could not find container \"4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226\": container with ID starting with 4cb81276a4bf02f6b926ffbecec38606191c596e2be0f9807f8dc9ba9a220226 not found: ID does not exist" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.445227 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.735365 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:05 crc kubenswrapper[4810]: W1003 09:06:05.735529 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7205bc13_c57e_491b_908e_6b99171a41a5.slice/crio-af35bf9e979616c64d0afd2b7353447d52dcd54a13a2bbb3cb53961bb8768cbf WatchSource:0}: Error finding container af35bf9e979616c64d0afd2b7353447d52dcd54a13a2bbb3cb53961bb8768cbf: Status 404 returned error can't find the container with id af35bf9e979616c64d0afd2b7353447d52dcd54a13a2bbb3cb53961bb8768cbf Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.892259 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.892555 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-79b657474f-9khfh" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" containerID="cri-o://ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de" gracePeriod=60 Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 
09:06:05.912111 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-79b657474f-9khfh" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.126:8004/healthcheck\": EOF" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.912696 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.912928 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-c4766b898-w7pkh" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" containerID="cri-o://573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5" gracePeriod=60 Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.927410 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.929208 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.930173 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-c4766b898-w7pkh" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.127:8000/healthcheck\": EOF" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.934475 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.934681 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.948933 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.961949 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.964975 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.971658 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 03 09:06:05 crc kubenswrapper[4810]: I1003 09:06:05.971835 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.004357 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.020601 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.020675 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.020848 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.021372 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.021616 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.021697 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.127877 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.127977 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jcdzj\" (UniqueName: \"kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128035 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128074 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128100 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128124 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128207 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128253 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128303 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128347 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.128405 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.146168 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.147401 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.148471 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.149194 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.164087 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.173606 4810 generic.go:334] "Generic (PLEG): container finished" podID="7205bc13-c57e-491b-908e-6b99171a41a5" containerID="6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7" exitCode=0 Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.173691 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerDied","Data":"6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.173717 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerStarted","Data":"af35bf9e979616c64d0afd2b7353447d52dcd54a13a2bbb3cb53961bb8768cbf"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.181886 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") pod \"heat-api-d6cf467cf-6jvjp\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.195236 4810 generic.go:334] "Generic (PLEG): container finished" podID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerID="c696b2778e354b80d59b0d2f5c8b5ceaf921e9e01cfcc66460ac4d1e181611ee" exitCode=1 Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.195321 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-767dcd9844-bdcsr" event={"ID":"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9","Type":"ContainerDied","Data":"c696b2778e354b80d59b0d2f5c8b5ceaf921e9e01cfcc66460ac4d1e181611ee"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.195347 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-767dcd9844-bdcsr" event={"ID":"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9","Type":"ContainerStarted","Data":"48c8b09ff8654568e7089ddae06ea1d82ecf0122b3e1149690dd2a6efd130de2"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.196009 4810 scope.go:117] "RemoveContainer" containerID="c696b2778e354b80d59b0d2f5c8b5ceaf921e9e01cfcc66460ac4d1e181611ee" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229740 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229780 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229864 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcdzj\" (UniqueName: \"kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229909 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.229970 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.240188 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5749954ff8-49msr" event={"ID":"cf4d30b3-ffd4-4b08-a100-5f12e75e15df","Type":"ContainerStarted","Data":"d3af3bf5d71a0a604c6699af3d61a5d4a32eefe91f8d72400fa7dcf26d3da2ff"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.240245 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5749954ff8-49msr" event={"ID":"cf4d30b3-ffd4-4b08-a100-5f12e75e15df","Type":"ContainerStarted","Data":"d4e4d3ce8860f6265699ee51a21d93e95fe2baf00b870ae83472820a0eb15d85"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.241092 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.246968 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.247460 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.248001 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.254437 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.259765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.262931 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.274335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55547566f9-2tzph" event={"ID":"4474daf6-43cd-4602-895c-c4ca53b7b35d","Type":"ContainerStarted","Data":"f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.274388 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55547566f9-2tzph" event={"ID":"4474daf6-43cd-4602-895c-c4ca53b7b35d","Type":"ContainerStarted","Data":"19e6acf552cc80d7718ee7888a9c389b0b7b04ed92e8baf33f267c22be53d095"} Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.275445 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.293426 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcdzj\" (UniqueName: \"kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj\") pod \"heat-cfnapi-577984bbd4-45rx5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.298464 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.319102 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5749954ff8-49msr" podStartSLOduration=2.3190766529999998 podStartE2EDuration="2.319076653s" podCreationTimestamp="2025-10-03 09:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:06.292477822 +0000 UTC m=+7799.719728567" watchObservedRunningTime="2025-10-03 09:06:06.319076653 +0000 UTC m=+7799.746327388" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.904036 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-55547566f9-2tzph" podStartSLOduration=2.9040204899999997 podStartE2EDuration="2.90402049s" podCreationTimestamp="2025-10-03 09:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:06.345256693 +0000 UTC m=+7799.772507418" watchObservedRunningTime="2025-10-03 09:06:06.90402049 +0000 UTC m=+7800.331271225" Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.911585 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:06:06 crc kubenswrapper[4810]: W1003 09:06:06.917249 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd03f8837_7028_4fa8_b8e2_9e12a35037c5.slice/crio-eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c WatchSource:0}: Error finding container eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c: Status 404 returned error can't find the container with id eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c Oct 03 09:06:06 crc kubenswrapper[4810]: I1003 09:06:06.945287 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.295630 4810 generic.go:334] 
"Generic (PLEG): container finished" podID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerID="d3af3bf5d71a0a604c6699af3d61a5d4a32eefe91f8d72400fa7dcf26d3da2ff" exitCode=1 Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.297092 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5749954ff8-49msr" event={"ID":"cf4d30b3-ffd4-4b08-a100-5f12e75e15df","Type":"ContainerDied","Data":"d3af3bf5d71a0a604c6699af3d61a5d4a32eefe91f8d72400fa7dcf26d3da2ff"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.297976 4810 scope.go:117] "RemoveContainer" containerID="d3af3bf5d71a0a604c6699af3d61a5d4a32eefe91f8d72400fa7dcf26d3da2ff" Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.321113 4810 generic.go:334] "Generic (PLEG): container finished" podID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerID="4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673" exitCode=1 Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.344774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-767dcd9844-bdcsr" event={"ID":"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9","Type":"ContainerDied","Data":"4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.345084 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-577984bbd4-45rx5" event={"ID":"d03f8837-7028-4fa8-b8e2-9e12a35037c5","Type":"ContainerStarted","Data":"f0b610453374d95fa8d8b8835a740b4722435108e83542f0e16bb20b2adca98f"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.345335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-577984bbd4-45rx5" event={"ID":"d03f8837-7028-4fa8-b8e2-9e12a35037c5","Type":"ContainerStarted","Data":"eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.345466 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d6cf467cf-6jvjp" event={"ID":"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293","Type":"ContainerStarted","Data":"a2c2611ee5c4b7a3eb513f4e9f6d3e9dfa093b9ff621e7dcd667a7d9a57b1f3e"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.345579 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d6cf467cf-6jvjp" event={"ID":"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293","Type":"ContainerStarted","Data":"396790b96bf82f48099bdd7be3654af1eda2c75580e145bcd422441e0776cb96"} Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.345119 4810 scope.go:117] "RemoveContainer" containerID="c696b2778e354b80d59b0d2f5c8b5ceaf921e9e01cfcc66460ac4d1e181611ee" Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.365048 4810 scope.go:117] "RemoveContainer" containerID="4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673" Oct 03 09:06:07 crc kubenswrapper[4810]: E1003 09:06:07.365356 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-767dcd9844-bdcsr_openstack(ff240e38-d82c-4bd6-a1fd-1d58b2e063a9)\"" pod="openstack/heat-api-767dcd9844-bdcsr" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" Oct 03 09:06:07 crc kubenswrapper[4810]: I1003 09:06:07.456704 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-d6cf467cf-6jvjp" podStartSLOduration=2.456681925 podStartE2EDuration="2.456681925s" podCreationTimestamp="2025-10-03 09:06:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:06:07.440102822 +0000 UTC m=+7800.867353567" watchObservedRunningTime="2025-10-03 09:06:07.456681925 +0000 UTC m=+7800.883932660" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.355520 4810 generic.go:334] "Generic (PLEG): container finished" podID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerID="46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84" exitCode=1 Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.355674 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5749954ff8-49msr" event={"ID":"cf4d30b3-ffd4-4b08-a100-5f12e75e15df","Type":"ContainerDied","Data":"46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84"} Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.355924 4810 scope.go:117] "RemoveContainer" containerID="d3af3bf5d71a0a604c6699af3d61a5d4a32eefe91f8d72400fa7dcf26d3da2ff" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.358309 4810 scope.go:117] "RemoveContainer" containerID="46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.361736 4810 scope.go:117] "RemoveContainer" containerID="4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673" Oct 03 09:06:08 crc kubenswrapper[4810]: E1003 09:06:08.361927 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-767dcd9844-bdcsr_openstack(ff240e38-d82c-4bd6-a1fd-1d58b2e063a9)\"" pod="openstack/heat-api-767dcd9844-bdcsr" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" Oct 03 09:06:08 crc kubenswrapper[4810]: E1003 09:06:08.361926 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5749954ff8-49msr_openstack(cf4d30b3-ffd4-4b08-a100-5f12e75e15df)\"" pod="openstack/heat-cfnapi-5749954ff8-49msr" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.373814 4810 generic.go:334] "Generic (PLEG): container finished" podID="7205bc13-c57e-491b-908e-6b99171a41a5" containerID="70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56" exitCode=0 Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.375659 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerDied","Data":"70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56"} Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.375798 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.377147 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:08 crc kubenswrapper[4810]: I1003 09:06:08.394438 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-577984bbd4-45rx5" podStartSLOduration=3.394418808 podStartE2EDuration="3.394418808s" podCreationTimestamp="2025-10-03 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 09:06:07.476394472 +0000 UTC m=+7800.903645207" watchObservedRunningTime="2025-10-03 09:06:08.394418808 +0000 UTC m=+7801.821669543" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.386451 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerStarted","Data":"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514"} Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.389248 4810 scope.go:117] "RemoveContainer" containerID="46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84" Oct 03 09:06:09 crc kubenswrapper[4810]: E1003 09:06:09.389539 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5749954ff8-49msr_openstack(cf4d30b3-ffd4-4b08-a100-5f12e75e15df)\"" pod="openstack/heat-cfnapi-5749954ff8-49msr" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.409930 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzp6l" podStartSLOduration=2.611662851 podStartE2EDuration="5.409887457s" podCreationTimestamp="2025-10-03 09:06:04 +0000 UTC" firstStartedPulling="2025-10-03 09:06:06.182120585 +0000 UTC m=+7799.609371330" lastFinishedPulling="2025-10-03 09:06:08.980345201 +0000 UTC m=+7802.407595936" observedRunningTime="2025-10-03 09:06:09.40286257 +0000 UTC m=+7802.830113305" watchObservedRunningTime="2025-10-03 09:06:09.409887457 +0000 UTC m=+7802.837138192" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.712516 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.712592 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.732552 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.732612 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:09 crc kubenswrapper[4810]: I1003 09:06:09.733352 4810 scope.go:117] "RemoveContainer" containerID="4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673" Oct 03 09:06:09 crc kubenswrapper[4810]: E1003 09:06:09.733680 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-767dcd9844-bdcsr_openstack(ff240e38-d82c-4bd6-a1fd-1d58b2e063a9)\"" pod="openstack/heat-api-767dcd9844-bdcsr" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" Oct 03 09:06:10 crc kubenswrapper[4810]: I1003 09:06:10.399634 4810 scope.go:117] "RemoveContainer" containerID="46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84" Oct 03 09:06:10 crc kubenswrapper[4810]: E1003 09:06:10.399929 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5749954ff8-49msr_openstack(cf4d30b3-ffd4-4b08-a100-5f12e75e15df)\"" 
pod="openstack/heat-cfnapi-5749954ff8-49msr" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" Oct 03 09:06:12 crc kubenswrapper[4810]: I1003 09:06:12.303196 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-79b657474f-9khfh" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.126:8004/healthcheck\": read tcp 10.217.0.2:55302->10.217.1.126:8004: read: connection reset by peer" Oct 03 09:06:12 crc kubenswrapper[4810]: I1003 09:06:12.355094 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-c4766b898-w7pkh" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.127:8000/healthcheck\": read tcp 10.217.0.2:49600->10.217.1.127:8000: read: connection reset by peer" Oct 03 09:06:12 crc kubenswrapper[4810]: I1003 09:06:12.890105 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:06:12 crc kubenswrapper[4810]: I1003 09:06:12.898960 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.019818 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle\") pod \"18e853cb-f172-4ea6-b59b-cf75610b3eba\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.019880 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data\") pod \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.019959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data\") pod \"18e853cb-f172-4ea6-b59b-cf75610b3eba\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.020102 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle\") pod \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.020212 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdggz\" (UniqueName: \"kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz\") pod \"18e853cb-f172-4ea6-b59b-cf75610b3eba\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.020271 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom\") pod \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.020335 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgs8c\" 
(UniqueName: \"kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c\") pod \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\" (UID: \"a43dd7fd-90e2-4dfe-83de-1094e1b24a93\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.020357 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom\") pod \"18e853cb-f172-4ea6-b59b-cf75610b3eba\" (UID: \"18e853cb-f172-4ea6-b59b-cf75610b3eba\") " Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.027487 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a43dd7fd-90e2-4dfe-83de-1094e1b24a93" (UID: "a43dd7fd-90e2-4dfe-83de-1094e1b24a93"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.027541 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c" (OuterVolumeSpecName: "kube-api-access-jgs8c") pod "a43dd7fd-90e2-4dfe-83de-1094e1b24a93" (UID: "a43dd7fd-90e2-4dfe-83de-1094e1b24a93"). InnerVolumeSpecName "kube-api-access-jgs8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.027572 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz" (OuterVolumeSpecName: "kube-api-access-vdggz") pod "18e853cb-f172-4ea6-b59b-cf75610b3eba" (UID: "18e853cb-f172-4ea6-b59b-cf75610b3eba"). InnerVolumeSpecName "kube-api-access-vdggz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.027585 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18e853cb-f172-4ea6-b59b-cf75610b3eba" (UID: "18e853cb-f172-4ea6-b59b-cf75610b3eba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.050255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a43dd7fd-90e2-4dfe-83de-1094e1b24a93" (UID: "a43dd7fd-90e2-4dfe-83de-1094e1b24a93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.053064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e853cb-f172-4ea6-b59b-cf75610b3eba" (UID: "18e853cb-f172-4ea6-b59b-cf75610b3eba"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.071027 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data" (OuterVolumeSpecName: "config-data") pod "18e853cb-f172-4ea6-b59b-cf75610b3eba" (UID: "18e853cb-f172-4ea6-b59b-cf75610b3eba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.078498 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data" (OuterVolumeSpecName: "config-data") pod "a43dd7fd-90e2-4dfe-83de-1094e1b24a93" (UID: "a43dd7fd-90e2-4dfe-83de-1094e1b24a93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122879 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdggz\" (UniqueName: \"kubernetes.io/projected/18e853cb-f172-4ea6-b59b-cf75610b3eba-kube-api-access-vdggz\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122934 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122948 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgs8c\" (UniqueName: \"kubernetes.io/projected/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-kube-api-access-jgs8c\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122965 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122977 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.122990 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.123001 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e853cb-f172-4ea6-b59b-cf75610b3eba-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.123013 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43dd7fd-90e2-4dfe-83de-1094e1b24a93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.427689 4810 generic.go:334] "Generic (PLEG): container finished" podID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerID="573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5" exitCode=0 Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.427928 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-c4766b898-w7pkh" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.428098 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c4766b898-w7pkh" event={"ID":"18e853cb-f172-4ea6-b59b-cf75610b3eba","Type":"ContainerDied","Data":"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5"} Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.428197 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c4766b898-w7pkh" event={"ID":"18e853cb-f172-4ea6-b59b-cf75610b3eba","Type":"ContainerDied","Data":"3682af873143dd810ad8d1b860d6859ee508b75adf7710c6cb159f475dc75111"} Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.428222 4810 scope.go:117] "RemoveContainer" containerID="573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.433592 4810 generic.go:334] "Generic (PLEG): container finished" podID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerID="ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de" exitCode=0 Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.433638 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b657474f-9khfh" event={"ID":"a43dd7fd-90e2-4dfe-83de-1094e1b24a93","Type":"ContainerDied","Data":"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de"} Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.433692 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79b657474f-9khfh" event={"ID":"a43dd7fd-90e2-4dfe-83de-1094e1b24a93","Type":"ContainerDied","Data":"578ed9fd80915ae8310641f10cc49d1ad485ed9cd536d93f4c6e7a89d2222863"} Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.435067 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-79b657474f-9khfh" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.463108 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.471477 4810 scope.go:117] "RemoveContainer" containerID="573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5" Oct 03 09:06:13 crc kubenswrapper[4810]: E1003 09:06:13.472017 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5\": container with ID starting with 573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5 not found: ID does not exist" containerID="573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.472066 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5"} err="failed to get container status \"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5\": rpc error: code = NotFound desc = could not find container \"573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5\": container with ID starting with 573ab62f23c26fc1eb7f27b5a0395c65420ef8f7df8056a77bcee6cbf7c7dfa5 not found: ID does not exist" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.472103 4810 scope.go:117] "RemoveContainer" containerID="ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.478979 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-c4766b898-w7pkh"] Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.481360 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.488756 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-79b657474f-9khfh"] Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.546527 4810 scope.go:117] "RemoveContainer" containerID="ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de" Oct 03 09:06:13 crc kubenswrapper[4810]: E1003 09:06:13.547083 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de\": container with ID starting with ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de not found: ID does not exist" containerID="ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.547125 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de"} err="failed to get container status \"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de\": rpc error: code = NotFound desc = could not find container \"ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de\": container with ID starting with ee097080e52a60ba29f9ce732d13edbcdd6d893a07d5373fe7de91f5a82fc6de not found: ID does not exist" Oct 03 09:06:13 crc kubenswrapper[4810]: I1003 09:06:13.898682 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-95fbdc968-9z87d" 
podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.117:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8443: connect: connection refused" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.056855 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.056925 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.102512 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.303666 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:06:15 crc kubenswrapper[4810]: E1003 09:06:15.304080 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.318506 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" path="/var/lib/kubelet/pods/18e853cb-f172-4ea6-b59b-cf75610b3eba/volumes" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.319186 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" path="/var/lib/kubelet/pods/a43dd7fd-90e2-4dfe-83de-1094e1b24a93/volumes" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.502230 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:15 crc kubenswrapper[4810]: I1003 09:06:15.559787 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.266041 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.472508 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzp6l" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="registry-server" containerID="cri-o://af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514" gracePeriod=2 Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.649702 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.720195 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.734487 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:17 crc kubenswrapper[4810]: I1003 09:06:17.799403 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.064559 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.137774 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content\") pod \"7205bc13-c57e-491b-908e-6b99171a41a5\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.137817 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities\") pod \"7205bc13-c57e-491b-908e-6b99171a41a5\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.137959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27p9q\" (UniqueName: \"kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q\") pod \"7205bc13-c57e-491b-908e-6b99171a41a5\" (UID: \"7205bc13-c57e-491b-908e-6b99171a41a5\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.140206 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities" (OuterVolumeSpecName: "utilities") pod "7205bc13-c57e-491b-908e-6b99171a41a5" (UID: "7205bc13-c57e-491b-908e-6b99171a41a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.152037 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q" (OuterVolumeSpecName: "kube-api-access-27p9q") pod "7205bc13-c57e-491b-908e-6b99171a41a5" (UID: "7205bc13-c57e-491b-908e-6b99171a41a5"). InnerVolumeSpecName "kube-api-access-27p9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.213325 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7205bc13-c57e-491b-908e-6b99171a41a5" (UID: "7205bc13-c57e-491b-908e-6b99171a41a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.240624 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.240732 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7205bc13-c57e-491b-908e-6b99171a41a5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.240801 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27p9q\" (UniqueName: \"kubernetes.io/projected/7205bc13-c57e-491b-908e-6b99171a41a5-kube-api-access-27p9q\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.243520 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.251097 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.342280 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data\") pod \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.342387 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data\") pod \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343119 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp2kg\" (UniqueName: \"kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg\") pod \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343188 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle\") pod \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343211 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qv4z\" (UniqueName: \"kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z\") pod \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343243 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle\") pod \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343309 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom\") pod \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\" (UID: \"cf4d30b3-ffd4-4b08-a100-5f12e75e15df\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.343369 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom\") pod \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\" (UID: \"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9\") " Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.347462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf4d30b3-ffd4-4b08-a100-5f12e75e15df" (UID: "cf4d30b3-ffd4-4b08-a100-5f12e75e15df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.347940 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg" (OuterVolumeSpecName: "kube-api-access-wp2kg") pod "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" (UID: "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9"). InnerVolumeSpecName "kube-api-access-wp2kg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.347978 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z" (OuterVolumeSpecName: "kube-api-access-7qv4z") pod "cf4d30b3-ffd4-4b08-a100-5f12e75e15df" (UID: "cf4d30b3-ffd4-4b08-a100-5f12e75e15df"). InnerVolumeSpecName "kube-api-access-7qv4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.348807 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" (UID: "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.371641 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" (UID: "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.377440 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf4d30b3-ffd4-4b08-a100-5f12e75e15df" (UID: "cf4d30b3-ffd4-4b08-a100-5f12e75e15df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.396684 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data" (OuterVolumeSpecName: "config-data") pod "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" (UID: "ff240e38-d82c-4bd6-a1fd-1d58b2e063a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.397049 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data" (OuterVolumeSpecName: "config-data") pod "cf4d30b3-ffd4-4b08-a100-5f12e75e15df" (UID: "cf4d30b3-ffd4-4b08-a100-5f12e75e15df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.446470 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447085 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp2kg\" (UniqueName: \"kubernetes.io/projected/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-kube-api-access-wp2kg\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447104 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447114 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qv4z\" (UniqueName: \"kubernetes.io/projected/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-kube-api-access-7qv4z\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447124 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447133 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447142 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.447150 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf4d30b3-ffd4-4b08-a100-5f12e75e15df-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.502253 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5749954ff8-49msr" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.502248 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5749954ff8-49msr" event={"ID":"cf4d30b3-ffd4-4b08-a100-5f12e75e15df","Type":"ContainerDied","Data":"d4e4d3ce8860f6265699ee51a21d93e95fe2baf00b870ae83472820a0eb15d85"} Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.502421 4810 scope.go:117] "RemoveContainer" containerID="46048e6cfb6a5b283b9f6c9e95329fe53070ee18ca0255081c780ab7fa425b84" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.505624 4810 generic.go:334] "Generic (PLEG): container finished" podID="7205bc13-c57e-491b-908e-6b99171a41a5" containerID="af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514" exitCode=0 Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.505689 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerDied","Data":"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514"} Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.505715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzp6l" event={"ID":"7205bc13-c57e-491b-908e-6b99171a41a5","Type":"ContainerDied","Data":"af35bf9e979616c64d0afd2b7353447d52dcd54a13a2bbb3cb53961bb8768cbf"} Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.505783 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzp6l" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.508646 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-767dcd9844-bdcsr" event={"ID":"ff240e38-d82c-4bd6-a1fd-1d58b2e063a9","Type":"ContainerDied","Data":"48c8b09ff8654568e7089ddae06ea1d82ecf0122b3e1149690dd2a6efd130de2"} Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.508734 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-767dcd9844-bdcsr" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.535632 4810 scope.go:117] "RemoveContainer" containerID="af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.564817 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.570689 4810 scope.go:117] "RemoveContainer" containerID="70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.575500 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzp6l"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.596396 4810 scope.go:117] "RemoveContainer" containerID="6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.610043 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.629655 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-767dcd9844-bdcsr"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.646624 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.657614 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5749954ff8-49msr"] Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.672867 4810 scope.go:117] "RemoveContainer" containerID="af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514" Oct 03 09:06:18 crc kubenswrapper[4810]: E1003 09:06:18.673280 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514\": container with ID starting with af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514 not found: ID does not exist" containerID="af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.673319 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514"} err="failed to get container status \"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514\": rpc error: code = NotFound desc = could not find container \"af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514\": container with ID starting with af13d9c8fda7ff62ec578d2e137173c1e86dc63793714f4e0c2f1bf6ae209514 not found: ID does not exist" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.673346 4810 scope.go:117] "RemoveContainer" containerID="70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56" Oct 03 09:06:18 crc kubenswrapper[4810]: E1003 09:06:18.673574 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56\": container with ID starting with 70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56 not found: ID does not exist" containerID="70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.673601 4810 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56"} err="failed to get container status \"70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56\": rpc error: code = NotFound desc = could not find container \"70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56\": container with ID starting with 70d86fda8509c89dc9da1b807245d5061bd701cf8a3416cec04e8d99be672c56 not found: ID does not exist" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.673616 4810 scope.go:117] "RemoveContainer" containerID="6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7" Oct 03 09:06:18 crc kubenswrapper[4810]: E1003 09:06:18.675250 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7\": container with ID starting with 6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7 not found: ID does not exist" containerID="6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.675284 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7"} err="failed to get container status \"6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7\": rpc error: code = NotFound desc = could not find container \"6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7\": container with ID starting with 6c72269ea927f4a0530c0aca16962546dfea952250d964ebdf396fb205b091a7 not found: ID does not exist" Oct 03 09:06:18 crc kubenswrapper[4810]: I1003 09:06:18.675300 4810 scope.go:117] "RemoveContainer" containerID="4698ab1c34768cfb3a1ba4e915059e3415c4880c5c17688f47835efeb4baf673" Oct 03 09:06:19 crc kubenswrapper[4810]: I1003 09:06:19.328829 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" path="/var/lib/kubelet/pods/7205bc13-c57e-491b-908e-6b99171a41a5/volumes" Oct 03 09:06:19 crc kubenswrapper[4810]: I1003 09:06:19.329955 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" path="/var/lib/kubelet/pods/cf4d30b3-ffd4-4b08-a100-5f12e75e15df/volumes" Oct 03 09:06:19 crc kubenswrapper[4810]: I1003 09:06:19.330572 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" path="/var/lib/kubelet/pods/ff240e38-d82c-4bd6-a1fd-1d58b2e063a9/volumes" Oct 03 09:06:22 crc kubenswrapper[4810]: I1003 09:06:22.752706 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-79b657474f-9khfh" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.126:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:06:22 crc kubenswrapper[4810]: I1003 09:06:22.785011 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-c4766b898-w7pkh" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.127:8000/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:06:23 crc kubenswrapper[4810]: I1003 09:06:23.897599 4810 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/horizon-95fbdc968-9z87d" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.117:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.117:8443: connect: connection refused" Oct 03 09:06:23 crc kubenswrapper[4810]: I1003 09:06:23.898465 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:06:24 crc kubenswrapper[4810]: I1003 09:06:24.654155 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:06:24 crc kubenswrapper[4810]: I1003 09:06:24.699912 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:06:24 crc kubenswrapper[4810]: I1003 09:06:24.700396 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-767579c56f-rmc2j" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerName="heat-engine" containerID="cri-o://b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" gracePeriod=60 Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.130067 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.195955 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199085 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199155 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199345 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfnhn\" (UniqueName: \"kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199388 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs" (OuterVolumeSpecName: "logs") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199426 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199478 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.199561 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts\") pod \"7ac00941-8b00-47af-9183-b7578731f8eb\" (UID: \"7ac00941-8b00-47af-9183-b7578731f8eb\") " Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.200451 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ac00941-8b00-47af-9183-b7578731f8eb-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.219156 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn" (OuterVolumeSpecName: "kube-api-access-kfnhn") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "kube-api-access-kfnhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.223010 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.263605 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts" (OuterVolumeSpecName: "scripts") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.284187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.288071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data" (OuterVolumeSpecName: "config-data") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.302534 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.302574 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.302588 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ac00941-8b00-47af-9183-b7578731f8eb-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.302600 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.302612 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfnhn\" (UniqueName: \"kubernetes.io/projected/7ac00941-8b00-47af-9183-b7578731f8eb-kube-api-access-kfnhn\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.339187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7ac00941-8b00-47af-9183-b7578731f8eb" (UID: "7ac00941-8b00-47af-9183-b7578731f8eb"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.405555 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ac00941-8b00-47af-9183-b7578731f8eb-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.576500 4810 generic.go:334] "Generic (PLEG): container finished" podID="7ac00941-8b00-47af-9183-b7578731f8eb" containerID="2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74" exitCode=137 Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.576562 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-95fbdc968-9z87d" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.576553 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerDied","Data":"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74"} Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.576715 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-95fbdc968-9z87d" event={"ID":"7ac00941-8b00-47af-9183-b7578731f8eb","Type":"ContainerDied","Data":"aab253c2eedc274ca536239b0930ccdf45476f0d1aa0ea8cc319156fe04e8d8b"} Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.576738 4810 scope.go:117] "RemoveContainer" containerID="a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.613212 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.622559 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-95fbdc968-9z87d"] Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.745166 4810 scope.go:117] "RemoveContainer" containerID="2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.805685 4810 scope.go:117] "RemoveContainer" containerID="a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c" Oct 03 09:06:25 crc kubenswrapper[4810]: E1003 09:06:25.806351 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c\": container with ID starting with a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c not found: ID does not exist" containerID="a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.806384 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c"} err="failed to get container status \"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c\": rpc error: code = NotFound desc = could not find container \"a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c\": container with ID starting with a393dee02fc465ebe355db30e7c7120301efd614c5c0d65af60664501bec226c not found: ID does not exist" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.806406 4810 scope.go:117] "RemoveContainer" containerID="2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74" Oct 03 09:06:25 crc kubenswrapper[4810]: E1003 09:06:25.806911 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74\": container with ID starting with 2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74 not found: ID does not exist" containerID="2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74" Oct 03 09:06:25 crc kubenswrapper[4810]: I1003 09:06:25.806942 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74"} err="failed to get container status 
\"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74\": rpc error: code = NotFound desc = could not find container \"2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74\": container with ID starting with 2fe650d4fbbad5647466dabdfe2362d23d2e59846993ed920b56f19351e88b74 not found: ID does not exist" Oct 03 09:06:27 crc kubenswrapper[4810]: E1003 09:06:27.235452 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:06:27 crc kubenswrapper[4810]: E1003 09:06:27.237145 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:06:27 crc kubenswrapper[4810]: E1003 09:06:27.239139 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:06:27 crc kubenswrapper[4810]: E1003 09:06:27.239177 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-767579c56f-rmc2j" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerName="heat-engine" Oct 03 09:06:27 crc kubenswrapper[4810]: I1003 09:06:27.339703 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" path="/var/lib/kubelet/pods/7ac00941-8b00-47af-9183-b7578731f8eb/volumes" Oct 03 09:06:28 crc kubenswrapper[4810]: I1003 09:06:28.303943 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:06:28 crc kubenswrapper[4810]: E1003 09:06:28.304555 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.171553 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.253160 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data\") pod \"e1977444-b7fb-4943-8079-d7c97d32a5ca\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.253220 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom\") pod \"e1977444-b7fb-4943-8079-d7c97d32a5ca\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.253301 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle\") pod \"e1977444-b7fb-4943-8079-d7c97d32a5ca\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.253506 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67qkg\" (UniqueName: \"kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg\") pod \"e1977444-b7fb-4943-8079-d7c97d32a5ca\" (UID: \"e1977444-b7fb-4943-8079-d7c97d32a5ca\") " Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.258944 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg" (OuterVolumeSpecName: "kube-api-access-67qkg") pod "e1977444-b7fb-4943-8079-d7c97d32a5ca" (UID: "e1977444-b7fb-4943-8079-d7c97d32a5ca"). InnerVolumeSpecName "kube-api-access-67qkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.261071 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1977444-b7fb-4943-8079-d7c97d32a5ca" (UID: "e1977444-b7fb-4943-8079-d7c97d32a5ca"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.313105 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1977444-b7fb-4943-8079-d7c97d32a5ca" (UID: "e1977444-b7fb-4943-8079-d7c97d32a5ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.342739 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data" (OuterVolumeSpecName: "config-data") pod "e1977444-b7fb-4943-8079-d7c97d32a5ca" (UID: "e1977444-b7fb-4943-8079-d7c97d32a5ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.355704 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.355742 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.355755 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1977444-b7fb-4943-8079-d7c97d32a5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.355765 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67qkg\" (UniqueName: \"kubernetes.io/projected/e1977444-b7fb-4943-8079-d7c97d32a5ca-kube-api-access-67qkg\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.645751 4810 generic.go:334] "Generic (PLEG): container finished" podID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" exitCode=0 Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.646167 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-767579c56f-rmc2j" event={"ID":"e1977444-b7fb-4943-8079-d7c97d32a5ca","Type":"ContainerDied","Data":"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec"} Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.646314 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-767579c56f-rmc2j" event={"ID":"e1977444-b7fb-4943-8079-d7c97d32a5ca","Type":"ContainerDied","Data":"38646f2ee41a0cf083362fc0ea524b7b0eb5cacc6a87ffa240fab81f4a8a1feb"} Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.646232 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-767579c56f-rmc2j" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.646377 4810 scope.go:117] "RemoveContainer" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.694510 4810 scope.go:117] "RemoveContainer" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" Oct 03 09:06:32 crc kubenswrapper[4810]: E1003 09:06:32.698401 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec\": container with ID starting with b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec not found: ID does not exist" containerID="b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.698462 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec"} err="failed to get container status \"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec\": rpc error: code = NotFound desc = could not find container \"b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec\": container with ID starting with b708a484413898957555a25780f9a9f1a308663cc44b5e0474021ed2f8f27cec not found: ID does not exist" Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.704045 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:06:32 crc kubenswrapper[4810]: I1003 09:06:32.736565 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-767579c56f-rmc2j"] Oct 03 09:06:33 crc kubenswrapper[4810]: I1003 09:06:33.315109 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" path="/var/lib/kubelet/pods/e1977444-b7fb-4943-8079-d7c97d32a5ca/volumes" Oct 03 09:06:41 crc kubenswrapper[4810]: I1003 09:06:41.302588 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:06:41 crc kubenswrapper[4810]: E1003 09:06:41.303503 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.557152 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs"] Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558031 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="extract-content" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558049 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="extract-content" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558067 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 
09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558075 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558084 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558092 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558110 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558117 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558133 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558140 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558155 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="extract-utilities" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558163 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="extract-utilities" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558184 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558191 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558205 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558213 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558268 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon-log" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558276 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon-log" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558285 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="registry-server" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558292 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="registry-server" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558307 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558315 
4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: E1003 09:06:42.558331 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerName="heat-engine" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558339 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerName="heat-engine" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558552 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558567 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1977444-b7fb-4943-8079-d7c97d32a5ca" containerName="heat-engine" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558585 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43dd7fd-90e2-4dfe-83de-1094e1b24a93" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558601 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558635 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558650 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e853cb-f172-4ea6-b59b-cf75610b3eba" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558666 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac00941-8b00-47af-9183-b7578731f8eb" containerName="horizon-log" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.558681 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7205bc13-c57e-491b-908e-6b99171a41a5" containerName="registry-server" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.559147 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff240e38-d82c-4bd6-a1fd-1d58b2e063a9" containerName="heat-api" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.559167 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4d30b3-ffd4-4b08-a100-5f12e75e15df" containerName="heat-cfnapi" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.560609 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.563187 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.576539 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs"] Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.673963 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.674019 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddf5\" (UniqueName: \"kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.674087 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.775723 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.775778 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddf5\" (UniqueName: \"kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.775860 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.776285 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.776298 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.793884 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddf5\" (UniqueName: \"kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:42 crc kubenswrapper[4810]: I1003 09:06:42.885991 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:43 crc kubenswrapper[4810]: I1003 09:06:43.388269 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs"] Oct 03 09:06:43 crc kubenswrapper[4810]: I1003 09:06:43.745560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerStarted","Data":"0a5a54009f266ce2fe57cf351526294964ba36a8c11edd4a21293b50b1c384b5"} Oct 03 09:06:43 crc kubenswrapper[4810]: I1003 09:06:43.745969 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerStarted","Data":"5ec3eb584175788b3308b6b415b88bd23a7685268e6f61a555f5f38a9a59afd0"} Oct 03 09:06:44 crc kubenswrapper[4810]: I1003 09:06:44.757495 4810 generic.go:334] "Generic (PLEG): container finished" podID="f7564445-9319-4f4c-8b03-4d98503a9164" containerID="0a5a54009f266ce2fe57cf351526294964ba36a8c11edd4a21293b50b1c384b5" exitCode=0 Oct 03 09:06:44 crc kubenswrapper[4810]: I1003 09:06:44.757545 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerDied","Data":"0a5a54009f266ce2fe57cf351526294964ba36a8c11edd4a21293b50b1c384b5"} Oct 03 09:06:44 crc kubenswrapper[4810]: I1003 09:06:44.761233 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:06:46 crc kubenswrapper[4810]: I1003 09:06:46.776840 4810 generic.go:334] "Generic (PLEG): container finished" podID="f7564445-9319-4f4c-8b03-4d98503a9164" containerID="2a5dbe3ef7208c9167229a7b831476aba24555034cfb66a1e0b4fb860af1ffd6" exitCode=0 Oct 03 09:06:46 crc kubenswrapper[4810]: I1003 09:06:46.776912 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerDied","Data":"2a5dbe3ef7208c9167229a7b831476aba24555034cfb66a1e0b4fb860af1ffd6"} Oct 03 09:06:47 crc kubenswrapper[4810]: I1003 09:06:47.792777 4810 generic.go:334] "Generic (PLEG): container finished" podID="f7564445-9319-4f4c-8b03-4d98503a9164" containerID="3f2448e867831c1ec5b6d11b2ed33a01807004cb22f0b8a873e4e8f0bd155764" exitCode=0 Oct 03 09:06:47 crc kubenswrapper[4810]: I1003 09:06:47.792848 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerDied","Data":"3f2448e867831c1ec5b6d11b2ed33a01807004cb22f0b8a873e4e8f0bd155764"} Oct 03 09:06:48 crc kubenswrapper[4810]: E1003 09:06:48.926242 4810 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.124467 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.203270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util\") pod \"f7564445-9319-4f4c-8b03-4d98503a9164\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.203375 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle\") pod \"f7564445-9319-4f4c-8b03-4d98503a9164\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.203521 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddf5\" (UniqueName: \"kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5\") pod \"f7564445-9319-4f4c-8b03-4d98503a9164\" (UID: \"f7564445-9319-4f4c-8b03-4d98503a9164\") " Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.205567 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle" (OuterVolumeSpecName: "bundle") pod "f7564445-9319-4f4c-8b03-4d98503a9164" (UID: "f7564445-9319-4f4c-8b03-4d98503a9164"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.208618 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5" (OuterVolumeSpecName: "kube-api-access-tddf5") pod "f7564445-9319-4f4c-8b03-4d98503a9164" (UID: "f7564445-9319-4f4c-8b03-4d98503a9164"). InnerVolumeSpecName "kube-api-access-tddf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.215281 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util" (OuterVolumeSpecName: "util") pod "f7564445-9319-4f4c-8b03-4d98503a9164" (UID: "f7564445-9319-4f4c-8b03-4d98503a9164"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.305674 4810 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-util\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.305711 4810 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f7564445-9319-4f4c-8b03-4d98503a9164-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.305723 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tddf5\" (UniqueName: \"kubernetes.io/projected/f7564445-9319-4f4c-8b03-4d98503a9164-kube-api-access-tddf5\") on node \"crc\" DevicePath \"\"" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.820209 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" event={"ID":"f7564445-9319-4f4c-8b03-4d98503a9164","Type":"ContainerDied","Data":"5ec3eb584175788b3308b6b415b88bd23a7685268e6f61a555f5f38a9a59afd0"} Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.820262 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ec3eb584175788b3308b6b415b88bd23a7685268e6f61a555f5f38a9a59afd0" Oct 03 09:06:49 crc kubenswrapper[4810]: I1003 09:06:49.820299 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs" Oct 03 09:06:54 crc kubenswrapper[4810]: I1003 09:06:54.050399 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lrm7c"] Oct 03 09:06:54 crc kubenswrapper[4810]: I1003 09:06:54.064689 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lrm7c"] Oct 03 09:06:54 crc kubenswrapper[4810]: I1003 09:06:54.302603 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:06:54 crc kubenswrapper[4810]: E1003 09:06:54.302941 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:06:55 crc kubenswrapper[4810]: I1003 09:06:55.321284 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e196d4f5-94ac-4345-8231-4b7e71f0ea1d" path="/var/lib/kubelet/pods/e196d4f5-94ac-4345-8231-4b7e71f0ea1d/volumes" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.605692 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2"] Oct 03 09:07:01 crc kubenswrapper[4810]: E1003 09:07:01.606652 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="extract" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.606667 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="extract" Oct 03 09:07:01 crc kubenswrapper[4810]: E1003 09:07:01.606702 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="pull" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.606709 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="pull" Oct 03 09:07:01 crc kubenswrapper[4810]: E1003 09:07:01.606724 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="util" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.606731 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="util" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.606975 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7564445-9319-4f4c-8b03-4d98503a9164" containerName="extract" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.607677 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.615797 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-tpprk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.616021 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.619337 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.619852 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.675492 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69x2\" (UniqueName: \"kubernetes.io/projected/8cea4c70-0056-479e-8759-eda2be569ee6-kube-api-access-n69x2\") pod \"obo-prometheus-operator-7c8cf85677-kg8d2\" (UID: \"8cea4c70-0056-479e-8759-eda2be569ee6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.720957 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.722710 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.724385 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.726539 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-f2krp" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.733335 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.735202 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.758881 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.777345 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.777489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.777582 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.777739 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69x2\" (UniqueName: \"kubernetes.io/projected/8cea4c70-0056-479e-8759-eda2be569ee6-kube-api-access-n69x2\") pod \"obo-prometheus-operator-7c8cf85677-kg8d2\" (UID: \"8cea4c70-0056-479e-8759-eda2be569ee6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.777809 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.793196 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.821863 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69x2\" (UniqueName: \"kubernetes.io/projected/8cea4c70-0056-479e-8759-eda2be569ee6-kube-api-access-n69x2\") pod \"obo-prometheus-operator-7c8cf85677-kg8d2\" (UID: \"8cea4c70-0056-479e-8759-eda2be569ee6\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.874759 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-krn6x"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.876155 4810 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879435 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879505 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879560 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879617 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879662 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-cr8j4" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.879934 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.889007 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.891879 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.891965 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/10a017e0-9508-4ba6-8107-f7eda7c39515-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk\" (UID: \"10a017e0-9508-4ba6-8107-f7eda7c39515\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.911470 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-krn6x"] Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.912503 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/322bc19c-a989-4073-bd2d-166be2c156d9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-68d8dd4df-nq646\" (UID: \"322bc19c-a989-4073-bd2d-166be2c156d9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.935214 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.981539 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/81f029f4-aa74-4b25-a6b5-b50a5933feec-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:01 crc kubenswrapper[4810]: I1003 09:07:01.981677 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/81f029f4-aa74-4b25-a6b5-b50a5933feec-kube-api-access-ktghv\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.036848 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-lsjsz"] Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.038690 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.040819 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-wj4bh" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.045747 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.066328 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-lsjsz"] Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.083490 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.084159 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/81f029f4-aa74-4b25-a6b5-b50a5933feec-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.084212 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdfvh\" (UniqueName: \"kubernetes.io/projected/70cc7fd9-2b69-41fc-be41-569b6c564807-kube-api-access-tdfvh\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.084257 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/70cc7fd9-2b69-41fc-be41-569b6c564807-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.084325 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/81f029f4-aa74-4b25-a6b5-b50a5933feec-kube-api-access-ktghv\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.092025 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/81f029f4-aa74-4b25-a6b5-b50a5933feec-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.126971 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktghv\" (UniqueName: \"kubernetes.io/projected/81f029f4-aa74-4b25-a6b5-b50a5933feec-kube-api-access-ktghv\") pod \"observability-operator-cc5f78dfc-krn6x\" (UID: \"81f029f4-aa74-4b25-a6b5-b50a5933feec\") " pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.187380 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdfvh\" (UniqueName: \"kubernetes.io/projected/70cc7fd9-2b69-41fc-be41-569b6c564807-kube-api-access-tdfvh\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.187443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/70cc7fd9-2b69-41fc-be41-569b6c564807-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " 
pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.188330 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/70cc7fd9-2b69-41fc-be41-569b6c564807-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.205341 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdfvh\" (UniqueName: \"kubernetes.io/projected/70cc7fd9-2b69-41fc-be41-569b6c564807-kube-api-access-tdfvh\") pod \"perses-operator-54bc95c9fb-lsjsz\" (UID: \"70cc7fd9-2b69-41fc-be41-569b6c564807\") " pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.396934 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.430877 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.821073 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2"] Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.831305 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk"] Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.976526 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646"] Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.989540 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" event={"ID":"8cea4c70-0056-479e-8759-eda2be569ee6","Type":"ContainerStarted","Data":"277b558b273b2297925751aa1f01c54c70db98171566a6703f5e1d1165c19633"} Oct 03 09:07:02 crc kubenswrapper[4810]: I1003 09:07:02.992852 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" event={"ID":"10a017e0-9508-4ba6-8107-f7eda7c39515","Type":"ContainerStarted","Data":"210afe298903e811f903dd6f1c4066fece34bc4de7aeef2cb75465c52ce81eae"} Oct 03 09:07:03 crc kubenswrapper[4810]: I1003 09:07:03.089155 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-lsjsz"] Oct 03 09:07:03 crc kubenswrapper[4810]: I1003 09:07:03.182235 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-krn6x"] Oct 03 09:07:04 crc kubenswrapper[4810]: I1003 09:07:04.007686 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" event={"ID":"70cc7fd9-2b69-41fc-be41-569b6c564807","Type":"ContainerStarted","Data":"eda1448c26c6a56ea6c20770a74cdfaa00602ee7adac04dd820b4ebd24d8c809"} Oct 03 09:07:04 crc kubenswrapper[4810]: I1003 09:07:04.013046 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" 
event={"ID":"81f029f4-aa74-4b25-a6b5-b50a5933feec","Type":"ContainerStarted","Data":"9524e13b0884b1c6a8197de5497f5e314236eb2144d7932d71cdb9e06d708b1b"} Oct 03 09:07:04 crc kubenswrapper[4810]: I1003 09:07:04.017103 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" event={"ID":"322bc19c-a989-4073-bd2d-166be2c156d9","Type":"ContainerStarted","Data":"0e0ebecf02fce3f366e6688ee1d5a627f69ef14045f7a4ece02123bab7c2acc8"} Oct 03 09:07:04 crc kubenswrapper[4810]: I1003 09:07:04.043967 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4b2d-account-create-5hn74"] Oct 03 09:07:04 crc kubenswrapper[4810]: I1003 09:07:04.059810 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4b2d-account-create-5hn74"] Oct 03 09:07:05 crc kubenswrapper[4810]: I1003 09:07:05.318529 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da33a722-8402-4b5c-acce-d1c1c974278c" path="/var/lib/kubelet/pods/da33a722-8402-4b5c-acce-d1c1c974278c/volumes" Oct 03 09:07:08 crc kubenswrapper[4810]: I1003 09:07:08.303397 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:07:08 crc kubenswrapper[4810]: E1003 09:07:08.304026 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.190492 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" event={"ID":"70cc7fd9-2b69-41fc-be41-569b6c564807","Type":"ContainerStarted","Data":"48a4147f89566cca3d81f4d3c21b4d458013345f288aeb034c12f7199e370f82"} Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.191175 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.193262 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" event={"ID":"81f029f4-aa74-4b25-a6b5-b50a5933feec","Type":"ContainerStarted","Data":"45da6d09a6e39ceb40a82888c0eaaf93fedbcc2ab80291a907cae8ec374d0f1a"} Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.194483 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.199066 4810 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-krn6x container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.138:8081/healthz\": dial tcp 10.217.1.138:8081: connect: connection refused" start-of-body= Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.199162 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" podUID="81f029f4-aa74-4b25-a6b5-b50a5933feec" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.138:8081/healthz\": dial tcp 10.217.1.138:8081: connect: connection refused" Oct 03 
09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.200816 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" event={"ID":"10a017e0-9508-4ba6-8107-f7eda7c39515","Type":"ContainerStarted","Data":"8986dc133e78bed79b014504d2689f07e4fd3f6a8360c06366bc5c10812f1d43"} Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.203104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" event={"ID":"322bc19c-a989-4073-bd2d-166be2c156d9","Type":"ContainerStarted","Data":"243e40c820c9628c66d3c3ea6b45d515b1a581c4c5f730b13d3d80db1de42e5a"} Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.227655 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" podStartSLOduration=1.873954449 podStartE2EDuration="14.227624636s" podCreationTimestamp="2025-10-03 09:07:02 +0000 UTC" firstStartedPulling="2025-10-03 09:07:03.10143287 +0000 UTC m=+7856.528683605" lastFinishedPulling="2025-10-03 09:07:15.455103047 +0000 UTC m=+7868.882353792" observedRunningTime="2025-10-03 09:07:16.217540577 +0000 UTC m=+7869.644791322" watchObservedRunningTime="2025-10-03 09:07:16.227624636 +0000 UTC m=+7869.654875381" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.292856 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" podStartSLOduration=2.821466255 podStartE2EDuration="15.292819357s" podCreationTimestamp="2025-10-03 09:07:01 +0000 UTC" firstStartedPulling="2025-10-03 09:07:03.20102276 +0000 UTC m=+7856.628273495" lastFinishedPulling="2025-10-03 09:07:15.672375862 +0000 UTC m=+7869.099626597" observedRunningTime="2025-10-03 09:07:16.242244226 +0000 UTC m=+7869.669494971" watchObservedRunningTime="2025-10-03 09:07:16.292819357 +0000 UTC m=+7869.720070092" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.301350 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk" podStartSLOduration=2.702034366 podStartE2EDuration="15.301323345s" podCreationTimestamp="2025-10-03 09:07:01 +0000 UTC" firstStartedPulling="2025-10-03 09:07:02.845018729 +0000 UTC m=+7856.272269464" lastFinishedPulling="2025-10-03 09:07:15.444307708 +0000 UTC m=+7868.871558443" observedRunningTime="2025-10-03 09:07:16.283655793 +0000 UTC m=+7869.710906548" watchObservedRunningTime="2025-10-03 09:07:16.301323345 +0000 UTC m=+7869.728574080" Oct 03 09:07:16 crc kubenswrapper[4810]: I1003 09:07:16.361695 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-68d8dd4df-nq646" podStartSLOduration=2.894467626 podStartE2EDuration="15.361674647s" podCreationTimestamp="2025-10-03 09:07:01 +0000 UTC" firstStartedPulling="2025-10-03 09:07:02.986849948 +0000 UTC m=+7856.414100683" lastFinishedPulling="2025-10-03 09:07:15.454056959 +0000 UTC m=+7868.881307704" observedRunningTime="2025-10-03 09:07:16.336078443 +0000 UTC m=+7869.763329178" watchObservedRunningTime="2025-10-03 09:07:16.361674647 +0000 UTC m=+7869.788925382" Oct 03 09:07:17 crc kubenswrapper[4810]: I1003 09:07:17.217645 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" 
event={"ID":"8cea4c70-0056-479e-8759-eda2be569ee6","Type":"ContainerStarted","Data":"c5b4b54986966bc1dea066bd9e4ea6c9a33dc37d3d5fa25d6d4bc82c3c3e64b3"} Oct 03 09:07:17 crc kubenswrapper[4810]: I1003 09:07:17.223136 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-krn6x" Oct 03 09:07:17 crc kubenswrapper[4810]: I1003 09:07:17.254003 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-kg8d2" podStartSLOduration=3.631192978 podStartE2EDuration="16.253972276s" podCreationTimestamp="2025-10-03 09:07:01 +0000 UTC" firstStartedPulling="2025-10-03 09:07:02.861084628 +0000 UTC m=+7856.288335363" lastFinishedPulling="2025-10-03 09:07:15.483863916 +0000 UTC m=+7868.911114661" observedRunningTime="2025-10-03 09:07:17.244872462 +0000 UTC m=+7870.672123197" watchObservedRunningTime="2025-10-03 09:07:17.253972276 +0000 UTC m=+7870.681223021" Oct 03 09:07:21 crc kubenswrapper[4810]: I1003 09:07:21.306186 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:07:21 crc kubenswrapper[4810]: E1003 09:07:21.307014 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:07:22 crc kubenswrapper[4810]: I1003 09:07:22.434999 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-lsjsz" Oct 03 09:07:23 crc kubenswrapper[4810]: I1003 09:07:23.499854 4810 scope.go:117] "RemoveContainer" containerID="ef9a092d8564e390c88525c36f006b321ef2f7ce62522db6f48f417ef6ff5929" Oct 03 09:07:23 crc kubenswrapper[4810]: I1003 09:07:23.530688 4810 scope.go:117] "RemoveContainer" containerID="bc3a0f56765ba9c994d5d365132eda2046effd5dc5d39bff5bc874426f4b6f6c" Oct 03 09:07:24 crc kubenswrapper[4810]: I1003 09:07:24.989881 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 09:07:24 crc kubenswrapper[4810]: I1003 09:07:24.990568 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" containerName="openstackclient" containerID="cri-o://4a754a9416c76b5ff8b30d2267dfe35883388f775a4427683dcd949007287c65" gracePeriod=2 Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.035821 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.136138 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 09:07:25 crc kubenswrapper[4810]: E1003 09:07:25.136666 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" containerName="openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.136689 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" containerName="openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.136980 4810 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" containerName="openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.137914 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.164596 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.164728 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjpq\" (UniqueName: \"kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.164800 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.164821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.180780 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.267323 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.267377 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.267439 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.268612 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjpq\" (UniqueName: \"kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.269684 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.274853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.274904 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" podUID="a0caf621-8402-40a0-90e3-e70455734a82" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.278758 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.349735 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjpq\" (UniqueName: \"kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq\") pod \"openstackclient\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.484034 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.683438 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.685193 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.692194 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-v4ch5" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.698376 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.790378 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t8r4\" (UniqueName: \"kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4\") pod \"kube-state-metrics-0\" (UID: \"a2973c82-79f4-492a-b9c8-414dae53b759\") " pod="openstack/kube-state-metrics-0" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.897788 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t8r4\" (UniqueName: \"kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4\") pod \"kube-state-metrics-0\" (UID: \"a2973c82-79f4-492a-b9c8-414dae53b759\") " pod="openstack/kube-state-metrics-0" Oct 03 09:07:25 crc kubenswrapper[4810]: I1003 09:07:25.946760 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t8r4\" (UniqueName: \"kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4\") pod \"kube-state-metrics-0\" (UID: \"a2973c82-79f4-492a-b9c8-414dae53b759\") " pod="openstack/kube-state-metrics-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.063554 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.262348 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.276430 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.282283 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-sw9vj" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.282514 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.282674 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.282819 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.314954 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.315912 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.316013 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.316044 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.316088 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.316144 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2w2w\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.316177 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419172 4810 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419274 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2w2w\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419321 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419488 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419626 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.419677 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.430982 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.431683 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.447005 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.464232 4810 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.467818 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.476188 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2w2w\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w\") pod \"alertmanager-metric-storage-0\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.586952 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.674501 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.891787 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.911009 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.911121 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.923553 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.923805 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.923961 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.924099 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.924232 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.924446 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-h9hmj" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972509 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972568 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972660 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972700 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsfsr\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972765 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972791 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets\") 
pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972835 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:26 crc kubenswrapper[4810]: I1003 09:07:26.972920 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.003416 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.091547 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092169 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092228 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092367 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092422 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsfsr\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092523 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 
09:07:27.092560 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.092729 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.095072 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.100359 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.104490 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.105098 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.134352 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.135252 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsfsr\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.168149 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.172725 4810 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.172783 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d66854587727a0fa130476e16e5761c8f5e5e94bddb4dd762c3ee28691e2c088/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.402522 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0caf621-8402-40a0-90e3-e70455734a82","Type":"ContainerStarted","Data":"3498b199bccc33619d4d243e52a4a293e9d0d6222f6c75db10ae41ae3976a1f7"} Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.427364 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.442208 4810 generic.go:334] "Generic (PLEG): container finished" podID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" containerID="4a754a9416c76b5ff8b30d2267dfe35883388f775a4427683dcd949007287c65" exitCode=137 Oct 03 09:07:27 crc kubenswrapper[4810]: W1003 09:07:27.570680 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2c3e51f_abd0_4a2f_870f_ad7a5d16c03d.slice/crio-203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011 WatchSource:0}: Error finding container 203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011: Status 404 returned error can't find the container with id 203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011 Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.571407 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a2973c82-79f4-492a-b9c8-414dae53b759","Type":"ContainerStarted","Data":"7173819b9a920c2eaba4795403e6e279b33f89bc56c11cf9b83a167ec57fcc79"} Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.748459 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:27 crc kubenswrapper[4810]: I1003 09:07:27.873858 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.061744 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.065984 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" podUID="a0caf621-8402-40a0-90e3-e70455734a82" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.090389 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config\") pod \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.090502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k64lg\" (UniqueName: \"kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg\") pod \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.090602 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret\") pod \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.090773 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle\") pod \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\" (UID: \"07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e\") " Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.116291 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg" (OuterVolumeSpecName: "kube-api-access-k64lg") pod "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" (UID: "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e"). InnerVolumeSpecName "kube-api-access-k64lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.151022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" (UID: "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.167059 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" (UID: "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.203372 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.203403 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.203415 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k64lg\" (UniqueName: \"kubernetes.io/projected/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-kube-api-access-k64lg\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.207200 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" (UID: "07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.306373 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.569206 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.596473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a2973c82-79f4-492a-b9c8-414dae53b759","Type":"ContainerStarted","Data":"a09e0db5b2095a446049e0ff9772dc37b61775dbc2bf1b492b79fdf166addfae"} Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.598180 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerStarted","Data":"203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011"} Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.599539 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a0caf621-8402-40a0-90e3-e70455734a82","Type":"ContainerStarted","Data":"15874275b3a2597d983a16f3a24a3d8a1f72f8482b5ea12528bb53e8dbd99e80"} Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.608371 4810 scope.go:117] "RemoveContainer" containerID="4a754a9416c76b5ff8b30d2267dfe35883388f775a4427683dcd949007287c65" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.608432 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.622579 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.622554197 podStartE2EDuration="3.622554197s" podCreationTimestamp="2025-10-03 09:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:07:28.621261712 +0000 UTC m=+7882.048512447" watchObservedRunningTime="2025-10-03 09:07:28.622554197 +0000 UTC m=+7882.049804942" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.625100 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" podUID="a0caf621-8402-40a0-90e3-e70455734a82" Oct 03 09:07:28 crc kubenswrapper[4810]: I1003 09:07:28.653105 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" podUID="a0caf621-8402-40a0-90e3-e70455734a82" Oct 03 09:07:29 crc kubenswrapper[4810]: I1003 09:07:29.324079 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e" path="/var/lib/kubelet/pods/07613d35-a1a6-4e9b-8a5b-a4e27c33ae3e/volumes" Oct 03 09:07:29 crc kubenswrapper[4810]: I1003 09:07:29.625462 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerStarted","Data":"672e38624c2298cef4fe926b04d2bce26262247eb0d21c0e2e7389c97b0ea185"} Oct 03 09:07:29 crc kubenswrapper[4810]: I1003 09:07:29.627082 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 09:07:29 crc kubenswrapper[4810]: I1003 09:07:29.658715 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.590746038 podStartE2EDuration="4.658696488s" podCreationTimestamp="2025-10-03 09:07:25 +0000 UTC" firstStartedPulling="2025-10-03 09:07:27.116139532 +0000 UTC m=+7880.543390267" lastFinishedPulling="2025-10-03 09:07:28.184089982 +0000 UTC m=+7881.611340717" observedRunningTime="2025-10-03 09:07:29.653226882 +0000 UTC m=+7883.080477617" watchObservedRunningTime="2025-10-03 09:07:29.658696488 +0000 UTC m=+7883.085947223" Oct 03 09:07:34 crc kubenswrapper[4810]: I1003 09:07:34.676917 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerStarted","Data":"fb806aa683711d50318108dee25c7aaf9c917a8e2dde745bb07d8f2c6a7ee8d7"} Oct 03 09:07:34 crc kubenswrapper[4810]: I1003 09:07:34.679802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerStarted","Data":"35ebea1b4f71fc0c6c9c8ae4c6ed23990e2f042b27e1024559087da107695952"} Oct 03 09:07:35 crc kubenswrapper[4810]: I1003 09:07:35.303718 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:07:35 crc kubenswrapper[4810]: E1003 09:07:35.304390 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:07:36 crc kubenswrapper[4810]: I1003 09:07:36.072389 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 09:07:41 crc kubenswrapper[4810]: I1003 09:07:41.746324 4810 generic.go:334] "Generic (PLEG): container finished" podID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerID="35ebea1b4f71fc0c6c9c8ae4c6ed23990e2f042b27e1024559087da107695952" exitCode=0 Oct 03 09:07:41 crc kubenswrapper[4810]: I1003 09:07:41.746425 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerDied","Data":"35ebea1b4f71fc0c6c9c8ae4c6ed23990e2f042b27e1024559087da107695952"} Oct 03 09:07:42 crc kubenswrapper[4810]: I1003 09:07:42.758361 4810 generic.go:334] "Generic (PLEG): container finished" podID="349e11b6-cc0d-494f-8a16-e8551416b426" containerID="fb806aa683711d50318108dee25c7aaf9c917a8e2dde745bb07d8f2c6a7ee8d7" exitCode=0 Oct 03 09:07:42 crc kubenswrapper[4810]: I1003 09:07:42.758411 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerDied","Data":"fb806aa683711d50318108dee25c7aaf9c917a8e2dde745bb07d8f2c6a7ee8d7"} Oct 03 09:07:46 crc kubenswrapper[4810]: I1003 09:07:46.798967 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerStarted","Data":"66f1ba2e75970a1aba16f1afb2a73d7784ddebbe5e4717cbfe405fc0b9fc68dd"} Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.043864 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-5k6fj"] Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.056082 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-5k6fj"] Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.320151 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bda40fe-b70d-45a5-a292-a1a27fae5e10" path="/var/lib/kubelet/pods/4bda40fe-b70d-45a5-a292-a1a27fae5e10/volumes" Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.835021 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerStarted","Data":"43e7dc7eb5e00cbb5009d4a66ffb6aa6e9fffc55143b955e5c66ac08125d7298"} Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.835739 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.838042 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:07:49 crc kubenswrapper[4810]: I1003 09:07:49.869622 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.028673063 podStartE2EDuration="23.869601077s" podCreationTimestamp="2025-10-03 09:07:26 +0000 UTC" firstStartedPulling="2025-10-03 09:07:27.730070873 +0000 UTC m=+7881.157321608" lastFinishedPulling="2025-10-03 
09:07:45.570998887 +0000 UTC m=+7898.998249622" observedRunningTime="2025-10-03 09:07:49.861556203 +0000 UTC m=+7903.288806938" watchObservedRunningTime="2025-10-03 09:07:49.869601077 +0000 UTC m=+7903.296851812" Oct 03 09:07:50 crc kubenswrapper[4810]: I1003 09:07:50.302515 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:07:50 crc kubenswrapper[4810]: E1003 09:07:50.303071 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:07:50 crc kubenswrapper[4810]: I1003 09:07:50.844854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerStarted","Data":"822c9c3094ef99ba7dddecfe37fdf63df80d8eb24f66b4f6d0d48efa7af54852"} Oct 03 09:07:53 crc kubenswrapper[4810]: I1003 09:07:53.874667 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerStarted","Data":"40fb33ac90113ee47eec1939ef7f50051da823527b857661001a884f9fefb1d8"} Oct 03 09:07:58 crc kubenswrapper[4810]: I1003 09:07:58.946312 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerStarted","Data":"f03519b1ec60783954e5cc8ad288cfd5848558733f08b56079a2b7df0d460399"} Oct 03 09:07:58 crc kubenswrapper[4810]: I1003 09:07:58.997492 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.044871019 podStartE2EDuration="33.997472876s" podCreationTimestamp="2025-10-03 09:07:25 +0000 UTC" firstStartedPulling="2025-10-03 09:07:28.587203302 +0000 UTC m=+7882.014454037" lastFinishedPulling="2025-10-03 09:07:58.539805159 +0000 UTC m=+7911.967055894" observedRunningTime="2025-10-03 09:07:58.989966085 +0000 UTC m=+7912.417216850" watchObservedRunningTime="2025-10-03 09:07:58.997472876 +0000 UTC m=+7912.424723611" Oct 03 09:08:02 crc kubenswrapper[4810]: I1003 09:08:02.874674 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:03 crc kubenswrapper[4810]: I1003 09:08:03.304404 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:08:03 crc kubenswrapper[4810]: E1003 09:08:03.304990 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.535170 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.538862 4810 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.541004 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.548805 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.554773 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.643304 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.643394 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.643474 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.643702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8b4f\" (UniqueName: \"kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.643824 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.644358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.644702 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.746773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 
09:08:05.746872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.746922 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.746986 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.747053 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b4f\" (UniqueName: \"kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.747097 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.747134 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.748762 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.748942 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.754797 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.755698 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.774945 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.778478 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8b4f\" (UniqueName: \"kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.778789 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data\") pod \"ceilometer-0\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " pod="openstack/ceilometer-0" Oct 03 09:08:05 crc kubenswrapper[4810]: I1003 09:08:05.879122 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:06 crc kubenswrapper[4810]: I1003 09:08:06.356388 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:07 crc kubenswrapper[4810]: I1003 09:08:07.037019 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerStarted","Data":"2be7163d0b1221e856298496f4c73a89859544d31babbdd2f1c14c6c04d727d2"} Oct 03 09:08:12 crc kubenswrapper[4810]: I1003 09:08:12.874438 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:12 crc kubenswrapper[4810]: I1003 09:08:12.879881 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:13 crc kubenswrapper[4810]: I1003 09:08:13.114363 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.715939 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.716236 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a0caf621-8402-40a0-90e3-e70455734a82" containerName="openstackclient" containerID="cri-o://15874275b3a2597d983a16f3a24a3d8a1f72f8482b5ea12528bb53e8dbd99e80" gracePeriod=2 Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.734596 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.759647 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 09:08:14 crc kubenswrapper[4810]: E1003 09:08:14.760207 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0caf621-8402-40a0-90e3-e70455734a82" containerName="openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.760221 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0caf621-8402-40a0-90e3-e70455734a82" containerName="openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.760420 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0caf621-8402-40a0-90e3-e70455734a82" containerName="openstackclient" Oct 03 09:08:14 crc 
kubenswrapper[4810]: I1003 09:08:14.761337 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.772278 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.784381 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a0caf621-8402-40a0-90e3-e70455734a82" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.865715 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.866339 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.866564 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.866876 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxgg\" (UniqueName: \"kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.969527 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.969629 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szxgg\" (UniqueName: \"kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.969804 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.969842 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret\") pod \"openstackclient\" 
(UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.970863 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.976731 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.978667 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:14 crc kubenswrapper[4810]: I1003 09:08:14.992329 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szxgg\" (UniqueName: \"kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg\") pod \"openstackclient\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " pod="openstack/openstackclient" Oct 03 09:08:15 crc kubenswrapper[4810]: I1003 09:08:15.084046 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:08:15 crc kubenswrapper[4810]: I1003 09:08:15.849994 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 09:08:15 crc kubenswrapper[4810]: W1003 09:08:15.854785 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc783295d_2aa9_46e1_baad_eef94c60dc6f.slice/crio-519fcd3933b5f911c8068fb0f68369c0cd448f34b04fdd0cb6d3e36fe30fdc8b WatchSource:0}: Error finding container 519fcd3933b5f911c8068fb0f68369c0cd448f34b04fdd0cb6d3e36fe30fdc8b: Status 404 returned error can't find the container with id 519fcd3933b5f911c8068fb0f68369c0cd448f34b04fdd0cb6d3e36fe30fdc8b Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.175332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c783295d-2aa9-46e1-baad-eef94c60dc6f","Type":"ContainerStarted","Data":"38f4857d992be3b22e9531eec0f7a8496e516e8d7855074bc7163e25e6d53fa8"} Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.176093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c783295d-2aa9-46e1-baad-eef94c60dc6f","Type":"ContainerStarted","Data":"519fcd3933b5f911c8068fb0f68369c0cd448f34b04fdd0cb6d3e36fe30fdc8b"} Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.177203 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.177466 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="prometheus" containerID="cri-o://822c9c3094ef99ba7dddecfe37fdf63df80d8eb24f66b4f6d0d48efa7af54852" gracePeriod=600 Oct 03 09:08:16 crc 
kubenswrapper[4810]: I1003 09:08:16.177535 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="thanos-sidecar" containerID="cri-o://f03519b1ec60783954e5cc8ad288cfd5848558733f08b56079a2b7df0d460399" gracePeriod=600 Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.177583 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="config-reloader" containerID="cri-o://40fb33ac90113ee47eec1939ef7f50051da823527b857661001a884f9fefb1d8" gracePeriod=600 Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.182524 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerStarted","Data":"d90cf4224decbb4f71ce96e20f0818572ce4e03e4c1206374fd696928ab618d8"} Oct 03 09:08:16 crc kubenswrapper[4810]: I1003 09:08:16.197005 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.196986635 podStartE2EDuration="2.196986635s" podCreationTimestamp="2025-10-03 09:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:16.195002401 +0000 UTC m=+7929.622253156" watchObservedRunningTime="2025-10-03 09:08:16.196986635 +0000 UTC m=+7929.624237370" Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.208528 4810 generic.go:334] "Generic (PLEG): container finished" podID="349e11b6-cc0d-494f-8a16-e8551416b426" containerID="f03519b1ec60783954e5cc8ad288cfd5848558733f08b56079a2b7df0d460399" exitCode=0 Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.208864 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerDied","Data":"f03519b1ec60783954e5cc8ad288cfd5848558733f08b56079a2b7df0d460399"} Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.208955 4810 generic.go:334] "Generic (PLEG): container finished" podID="349e11b6-cc0d-494f-8a16-e8551416b426" containerID="40fb33ac90113ee47eec1939ef7f50051da823527b857661001a884f9fefb1d8" exitCode=0 Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.208971 4810 generic.go:334] "Generic (PLEG): container finished" podID="349e11b6-cc0d-494f-8a16-e8551416b426" containerID="822c9c3094ef99ba7dddecfe37fdf63df80d8eb24f66b4f6d0d48efa7af54852" exitCode=0 Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.209023 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerDied","Data":"40fb33ac90113ee47eec1939ef7f50051da823527b857661001a884f9fefb1d8"} Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.209040 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerDied","Data":"822c9c3094ef99ba7dddecfe37fdf63df80d8eb24f66b4f6d0d48efa7af54852"} Oct 03 09:08:17 crc kubenswrapper[4810]: I1003 09:08:17.217799 4810 generic.go:334] "Generic (PLEG): container finished" podID="a0caf621-8402-40a0-90e3-e70455734a82" containerID="15874275b3a2597d983a16f3a24a3d8a1f72f8482b5ea12528bb53e8dbd99e80" exitCode=137 Oct 03 09:08:18 crc 
kubenswrapper[4810]: I1003 09:08:18.041000 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.151644 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsfsr\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.151797 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.151858 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.152076 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.152144 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.152228 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.152259 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.152307 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file\") pod \"349e11b6-cc0d-494f-8a16-e8551416b426\" (UID: \"349e11b6-cc0d-494f-8a16-e8551416b426\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.153373 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.161752 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.162834 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.164225 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr" (OuterVolumeSpecName: "kube-api-access-dsfsr") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "kube-api-access-dsfsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.168176 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config" (OuterVolumeSpecName: "config") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.169404 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out" (OuterVolumeSpecName: "config-out") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.207312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config" (OuterVolumeSpecName: "web-config") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.220397 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "349e11b6-cc0d-494f-8a16-e8551416b426" (UID: "349e11b6-cc0d-494f-8a16-e8551416b426"). InnerVolumeSpecName "pvc-69452521-9c79-47c6-8ea5-998f140a0d2a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256483 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/349e11b6-cc0d-494f-8a16-e8551416b426-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256509 4810 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256524 4810 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256538 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsfsr\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-kube-api-access-dsfsr\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256553 4810 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/349e11b6-cc0d-494f-8a16-e8551416b426-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256563 4810 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/349e11b6-cc0d-494f-8a16-e8551416b426-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256616 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") on node \"crc\" " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.256633 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/349e11b6-cc0d-494f-8a16-e8551416b426-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.263691 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"349e11b6-cc0d-494f-8a16-e8551416b426","Type":"ContainerDied","Data":"672e38624c2298cef4fe926b04d2bce26262247eb0d21c0e2e7389c97b0ea185"} Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.263753 4810 scope.go:117] "RemoveContainer" containerID="f03519b1ec60783954e5cc8ad288cfd5848558733f08b56079a2b7df0d460399" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.263707 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.269718 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3498b199bccc33619d4d243e52a4a293e9d0d6222f6c75db10ae41ae3976a1f7" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.300285 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.301021 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-69452521-9c79-47c6-8ea5-998f140a0d2a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a") on node "crc" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.306271 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:08:18 crc kubenswrapper[4810]: E1003 09:08:18.306721 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.358739 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.369754 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.370360 4810 scope.go:117] "RemoveContainer" containerID="40fb33ac90113ee47eec1939ef7f50051da823527b857661001a884f9fefb1d8" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.383135 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a0caf621-8402-40a0-90e3-e70455734a82" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.389574 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.405955 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.442260 4810 scope.go:117] "RemoveContainer" containerID="822c9c3094ef99ba7dddecfe37fdf63df80d8eb24f66b4f6d0d48efa7af54852" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.472646 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phjpq\" (UniqueName: \"kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq\") pod \"a0caf621-8402-40a0-90e3-e70455734a82\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.472799 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle\") pod \"a0caf621-8402-40a0-90e3-e70455734a82\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.472927 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config\") pod \"a0caf621-8402-40a0-90e3-e70455734a82\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 
09:08:18.473002 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret\") pod \"a0caf621-8402-40a0-90e3-e70455734a82\" (UID: \"a0caf621-8402-40a0-90e3-e70455734a82\") " Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.477325 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.477785 4810 scope.go:117] "RemoveContainer" containerID="fb806aa683711d50318108dee25c7aaf9c917a8e2dde745bb07d8f2c6a7ee8d7" Oct 03 09:08:18 crc kubenswrapper[4810]: E1003 09:08:18.479786 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="prometheus" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.479814 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="prometheus" Oct 03 09:08:18 crc kubenswrapper[4810]: E1003 09:08:18.479858 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="config-reloader" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.479864 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="config-reloader" Oct 03 09:08:18 crc kubenswrapper[4810]: E1003 09:08:18.479956 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="thanos-sidecar" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.479964 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="thanos-sidecar" Oct 03 09:08:18 crc kubenswrapper[4810]: E1003 09:08:18.480003 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="init-config-reloader" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.480009 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="init-config-reloader" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.481323 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="prometheus" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.481381 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="config-reloader" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.482137 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq" (OuterVolumeSpecName: "kube-api-access-phjpq") pod "a0caf621-8402-40a0-90e3-e70455734a82" (UID: "a0caf621-8402-40a0-90e3-e70455734a82"). InnerVolumeSpecName "kube-api-access-phjpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.482355 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="thanos-sidecar" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.503399 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.506877 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.509281 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.509530 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.509533 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.510051 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-h9hmj" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.510288 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.511189 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.514462 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0caf621-8402-40a0-90e3-e70455734a82" (UID: "a0caf621-8402-40a0-90e3-e70455734a82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.516209 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.524022 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a0caf621-8402-40a0-90e3-e70455734a82" (UID: "a0caf621-8402-40a0-90e3-e70455734a82"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.558291 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a0caf621-8402-40a0-90e3-e70455734a82" (UID: "a0caf621-8402-40a0-90e3-e70455734a82"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.575595 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576102 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mphm\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576209 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576302 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576402 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576565 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576657 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 
03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576765 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.576920 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.577198 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.577418 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.577446 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.577465 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a0caf621-8402-40a0-90e3-e70455734a82-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.577480 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phjpq\" (UniqueName: \"kubernetes.io/projected/a0caf621-8402-40a0-90e3-e70455734a82-kube-api-access-phjpq\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679062 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679516 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679550 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 
crc kubenswrapper[4810]: I1003 09:08:18.679599 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679621 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mphm\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679645 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679669 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679783 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679808 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.679837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.680591 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.684677 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.686006 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.686005 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.686315 4810 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.687582 4810 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d66854587727a0fa130476e16e5761c8f5e5e94bddb4dd762c3ee28691e2c088/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.687756 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.687188 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.686616 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.688173 4810 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.692840 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.704034 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mphm\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.748725 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"prometheus-metric-storage-0\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:18 crc kubenswrapper[4810]: I1003 09:08:18.867288 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.306174 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerStarted","Data":"8b000eaa2a08993718afb67ce9d0e3542283c37ab2c92a8bcbdc9d1403fee6b1"} Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.310233 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.318360 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a0caf621-8402-40a0-90e3-e70455734a82" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.336145 4810 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a0caf621-8402-40a0-90e3-e70455734a82" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.345315 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" path="/var/lib/kubelet/pods/349e11b6-cc0d-494f-8a16-e8551416b426/volumes" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.346814 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0caf621-8402-40a0-90e3-e70455734a82" path="/var/lib/kubelet/pods/a0caf621-8402-40a0-90e3-e70455734a82/volumes" Oct 03 09:08:19 crc kubenswrapper[4810]: I1003 09:08:19.425825 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:08:19 crc kubenswrapper[4810]: W1003 09:08:19.433028 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99cbdb61_8043_480e_836c_876ea826c5c3.slice/crio-d0093dd8aa49c6d6090c46f25a1f776c62911fbfaafc33c8a5a613f60e613848 WatchSource:0}: Error finding container d0093dd8aa49c6d6090c46f25a1f776c62911fbfaafc33c8a5a613f60e613848: Status 404 returned error can't find the container with id d0093dd8aa49c6d6090c46f25a1f776c62911fbfaafc33c8a5a613f60e613848 Oct 03 09:08:20 crc kubenswrapper[4810]: I1003 09:08:20.336849 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerStarted","Data":"d0093dd8aa49c6d6090c46f25a1f776c62911fbfaafc33c8a5a613f60e613848"} Oct 03 09:08:20 crc kubenswrapper[4810]: I1003 09:08:20.342852 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerStarted","Data":"8ffcfbde2a1c3a459e0ce66bccb7fa28b98e55b2c8d21d9d8f540085375a38e4"} Oct 03 09:08:20 crc kubenswrapper[4810]: I1003 09:08:20.875146 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="349e11b6-cc0d-494f-8a16-e8551416b426" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.143:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.376264 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerStarted","Data":"2840747b51478463d6c5afc58d83f7b9ef58783f4b5bb105804f83bb4c64fca8"} Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.377868 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.377989 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerStarted","Data":"f13a52947ceffedfa962780acfef9809f64ca3164e93bfaa9673bc9fcd4a3572"} Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.398633 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.219782141 podStartE2EDuration="18.398616931s" podCreationTimestamp="2025-10-03 09:08:05 +0000 UTC" firstStartedPulling="2025-10-03 09:08:06.367653867 +0000 UTC m=+7919.794904602" lastFinishedPulling="2025-10-03 09:08:22.546488637 +0000 UTC m=+7935.973739392" observedRunningTime="2025-10-03 09:08:23.394327977 +0000 UTC m=+7936.821578712" watchObservedRunningTime="2025-10-03 09:08:23.398616931 +0000 UTC m=+7936.825867666" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.662274 4810 scope.go:117] "RemoveContainer" containerID="bd3d09032be293ee570439dc544a171fa80769c7a04e80db8ddeac767dc95a48" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.685519 4810 scope.go:117] "RemoveContainer" containerID="8c36e2e8cd4f021ac4fec75baa0abcd4e71eebb83a3be8eed45f91852ac6d3b0" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.717671 4810 scope.go:117] "RemoveContainer" containerID="f46648c087159842a6b62f0c6ac93c98cf576d92e565125999b92e17084b4735" Oct 03 09:08:23 crc kubenswrapper[4810]: I1003 09:08:23.763808 4810 scope.go:117] "RemoveContainer" containerID="b0d216d372f54f7525fe772f1bffc384d325ecc2daa15973b2e367b10c0924e3" Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.301453 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-4kpwj"] Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.306156 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.315440 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4kpwj"] Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.417703 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcqt6\" (UniqueName: \"kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6\") pod \"aodh-db-create-4kpwj\" (UID: \"4843e422-4c1a-41ed-9626-2593e9f064fd\") " pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.520156 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcqt6\" (UniqueName: \"kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6\") pod \"aodh-db-create-4kpwj\" (UID: \"4843e422-4c1a-41ed-9626-2593e9f064fd\") " pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.547582 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcqt6\" (UniqueName: \"kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6\") pod \"aodh-db-create-4kpwj\" (UID: \"4843e422-4c1a-41ed-9626-2593e9f064fd\") " pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:28 crc kubenswrapper[4810]: I1003 09:08:28.634737 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:29 crc kubenswrapper[4810]: I1003 09:08:29.125985 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-4kpwj"] Oct 03 09:08:29 crc kubenswrapper[4810]: I1003 09:08:29.445859 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4kpwj" event={"ID":"4843e422-4c1a-41ed-9626-2593e9f064fd","Type":"ContainerStarted","Data":"adb7a5c808b93d6656ba7be4b3adb1cef9ad5a4f1a8d95fd455fb9be3ec69058"} Oct 03 09:08:29 crc kubenswrapper[4810]: I1003 09:08:29.445924 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4kpwj" event={"ID":"4843e422-4c1a-41ed-9626-2593e9f064fd","Type":"ContainerStarted","Data":"e2cb1dedadc14844413bba98a29b3b084fd0e3736ad988e50ad3e9076fddf79b"} Oct 03 09:08:29 crc kubenswrapper[4810]: I1003 09:08:29.461258 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-4kpwj" podStartSLOduration=1.461235439 podStartE2EDuration="1.461235439s" podCreationTimestamp="2025-10-03 09:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:29.459598795 +0000 UTC m=+7942.886849530" watchObservedRunningTime="2025-10-03 09:08:29.461235439 +0000 UTC m=+7942.888486184" Oct 03 09:08:30 crc kubenswrapper[4810]: I1003 09:08:30.458122 4810 generic.go:334] "Generic (PLEG): container finished" podID="4843e422-4c1a-41ed-9626-2593e9f064fd" containerID="adb7a5c808b93d6656ba7be4b3adb1cef9ad5a4f1a8d95fd455fb9be3ec69058" exitCode=0 Oct 03 09:08:30 crc kubenswrapper[4810]: I1003 09:08:30.458194 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4kpwj" event={"ID":"4843e422-4c1a-41ed-9626-2593e9f064fd","Type":"ContainerDied","Data":"adb7a5c808b93d6656ba7be4b3adb1cef9ad5a4f1a8d95fd455fb9be3ec69058"} Oct 03 09:08:31 crc kubenswrapper[4810]: I1003 09:08:31.302966 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:08:31 crc kubenswrapper[4810]: E1003 09:08:31.303556 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:08:31 crc kubenswrapper[4810]: I1003 09:08:31.932215 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.101352 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcqt6\" (UniqueName: \"kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6\") pod \"4843e422-4c1a-41ed-9626-2593e9f064fd\" (UID: \"4843e422-4c1a-41ed-9626-2593e9f064fd\") " Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.106724 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6" (OuterVolumeSpecName: "kube-api-access-dcqt6") pod "4843e422-4c1a-41ed-9626-2593e9f064fd" (UID: "4843e422-4c1a-41ed-9626-2593e9f064fd"). 
InnerVolumeSpecName "kube-api-access-dcqt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.204568 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcqt6\" (UniqueName: \"kubernetes.io/projected/4843e422-4c1a-41ed-9626-2593e9f064fd-kube-api-access-dcqt6\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.480481 4810 generic.go:334] "Generic (PLEG): container finished" podID="99cbdb61-8043-480e-836c-876ea826c5c3" containerID="f13a52947ceffedfa962780acfef9809f64ca3164e93bfaa9673bc9fcd4a3572" exitCode=0 Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.480598 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerDied","Data":"f13a52947ceffedfa962780acfef9809f64ca3164e93bfaa9673bc9fcd4a3572"} Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.487660 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-4kpwj" event={"ID":"4843e422-4c1a-41ed-9626-2593e9f064fd","Type":"ContainerDied","Data":"e2cb1dedadc14844413bba98a29b3b084fd0e3736ad988e50ad3e9076fddf79b"} Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.487694 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2cb1dedadc14844413bba98a29b3b084fd0e3736ad988e50ad3e9076fddf79b" Oct 03 09:08:32 crc kubenswrapper[4810]: I1003 09:08:32.497101 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-4kpwj" Oct 03 09:08:33 crc kubenswrapper[4810]: I1003 09:08:33.502261 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerStarted","Data":"695d874952bb7697f4f41bab1649fbdd13c3b210395aab381ef29bcdad672912"} Oct 03 09:08:36 crc kubenswrapper[4810]: I1003 09:08:36.163915 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 09:08:36 crc kubenswrapper[4810]: I1003 09:08:36.529849 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerStarted","Data":"bfb8c46462d42f7cabbdb9785c5516092ad0178d1e18d3fc0e4dc555e3aec222"} Oct 03 09:08:37 crc kubenswrapper[4810]: I1003 09:08:37.545514 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerStarted","Data":"a140b7e92a1d9149e2d71e898a7a3c0826dc5bdb44363234e8c1a19baa36023f"} Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.454762 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=20.454735688 podStartE2EDuration="20.454735688s" podCreationTimestamp="2025-10-03 09:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:08:37.575642882 +0000 UTC m=+7951.002893637" watchObservedRunningTime="2025-10-03 09:08:38.454735688 +0000 UTC m=+7951.881986423" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.458135 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-a3b8-account-create-qm2vt"] Oct 03 09:08:38 crc kubenswrapper[4810]: E1003 
09:08:38.462823 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4843e422-4c1a-41ed-9626-2593e9f064fd" containerName="mariadb-database-create" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.462861 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4843e422-4c1a-41ed-9626-2593e9f064fd" containerName="mariadb-database-create" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.466371 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4843e422-4c1a-41ed-9626-2593e9f064fd" containerName="mariadb-database-create" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.467585 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.469455 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.475808 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a3b8-account-create-qm2vt"] Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.648365 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44n4\" (UniqueName: \"kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4\") pod \"aodh-a3b8-account-create-qm2vt\" (UID: \"138a7b81-cd2a-4106-8743-b1520c597d0b\") " pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.751190 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44n4\" (UniqueName: \"kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4\") pod \"aodh-a3b8-account-create-qm2vt\" (UID: \"138a7b81-cd2a-4106-8743-b1520c597d0b\") " pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.770980 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44n4\" (UniqueName: \"kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4\") pod \"aodh-a3b8-account-create-qm2vt\" (UID: \"138a7b81-cd2a-4106-8743-b1520c597d0b\") " pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.803488 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:38 crc kubenswrapper[4810]: I1003 09:08:38.868501 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:39 crc kubenswrapper[4810]: I1003 09:08:39.356298 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-a3b8-account-create-qm2vt"] Oct 03 09:08:39 crc kubenswrapper[4810]: I1003 09:08:39.569538 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a3b8-account-create-qm2vt" event={"ID":"138a7b81-cd2a-4106-8743-b1520c597d0b","Type":"ContainerStarted","Data":"2704d41a752a4a7248480febcccf21c373196cdd72831126848905ca0dfe70c4"} Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.263294 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.263882 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a2973c82-79f4-492a-b9c8-414dae53b759" containerName="kube-state-metrics" containerID="cri-o://a09e0db5b2095a446049e0ff9772dc37b61775dbc2bf1b492b79fdf166addfae" gracePeriod=30 Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.582550 4810 generic.go:334] "Generic (PLEG): container finished" podID="138a7b81-cd2a-4106-8743-b1520c597d0b" containerID="a72f3af1a00931516d7f54475ffe568d59818e118dc990168c338c158fa9a828" exitCode=0 Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.582642 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a3b8-account-create-qm2vt" event={"ID":"138a7b81-cd2a-4106-8743-b1520c597d0b","Type":"ContainerDied","Data":"a72f3af1a00931516d7f54475ffe568d59818e118dc990168c338c158fa9a828"} Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.586285 4810 generic.go:334] "Generic (PLEG): container finished" podID="a2973c82-79f4-492a-b9c8-414dae53b759" containerID="a09e0db5b2095a446049e0ff9772dc37b61775dbc2bf1b492b79fdf166addfae" exitCode=2 Oct 03 09:08:40 crc kubenswrapper[4810]: I1003 09:08:40.586349 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a2973c82-79f4-492a-b9c8-414dae53b759","Type":"ContainerDied","Data":"a09e0db5b2095a446049e0ff9772dc37b61775dbc2bf1b492b79fdf166addfae"} Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.520919 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.606943 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a2973c82-79f4-492a-b9c8-414dae53b759","Type":"ContainerDied","Data":"7173819b9a920c2eaba4795403e6e279b33f89bc56c11cf9b83a167ec57fcc79"} Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.607009 4810 scope.go:117] "RemoveContainer" containerID="a09e0db5b2095a446049e0ff9772dc37b61775dbc2bf1b492b79fdf166addfae" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.606963 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.676975 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t8r4\" (UniqueName: \"kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4\") pod \"a2973c82-79f4-492a-b9c8-414dae53b759\" (UID: \"a2973c82-79f4-492a-b9c8-414dae53b759\") " Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.694522 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4" (OuterVolumeSpecName: "kube-api-access-5t8r4") pod "a2973c82-79f4-492a-b9c8-414dae53b759" (UID: "a2973c82-79f4-492a-b9c8-414dae53b759"). InnerVolumeSpecName "kube-api-access-5t8r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.781058 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t8r4\" (UniqueName: \"kubernetes.io/projected/a2973c82-79f4-492a-b9c8-414dae53b759-kube-api-access-5t8r4\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.950594 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.971413 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.985351 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:41 crc kubenswrapper[4810]: E1003 09:08:41.985842 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2973c82-79f4-492a-b9c8-414dae53b759" containerName="kube-state-metrics" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.985865 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2973c82-79f4-492a-b9c8-414dae53b759" containerName="kube-state-metrics" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.989822 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2973c82-79f4-492a-b9c8-414dae53b759" containerName="kube-state-metrics" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.990966 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.995445 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.998372 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 09:08:41 crc kubenswrapper[4810]: I1003 09:08:41.999225 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.169482 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.197023 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.197316 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f26jm\" (UniqueName: \"kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.197358 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.197440 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.307922 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s44n4\" (UniqueName: \"kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4\") pod \"138a7b81-cd2a-4106-8743-b1520c597d0b\" (UID: \"138a7b81-cd2a-4106-8743-b1520c597d0b\") " Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.308605 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.308646 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f26jm\" (UniqueName: \"kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.308697 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.308955 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" 
(UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.313158 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4" (OuterVolumeSpecName: "kube-api-access-s44n4") pod "138a7b81-cd2a-4106-8743-b1520c597d0b" (UID: "138a7b81-cd2a-4106-8743-b1520c597d0b"). InnerVolumeSpecName "kube-api-access-s44n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.316361 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.319978 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.329680 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.344179 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f26jm\" (UniqueName: \"kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm\") pod \"kube-state-metrics-0\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.412133 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s44n4\" (UniqueName: \"kubernetes.io/projected/138a7b81-cd2a-4106-8743-b1520c597d0b-kube-api-access-s44n4\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.456684 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.574199 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.574529 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-central-agent" containerID="cri-o://d90cf4224decbb4f71ce96e20f0818572ce4e03e4c1206374fd696928ab618d8" gracePeriod=30 Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.574570 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="proxy-httpd" containerID="cri-o://2840747b51478463d6c5afc58d83f7b9ef58783f4b5bb105804f83bb4c64fca8" gracePeriod=30 Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.574591 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="sg-core" containerID="cri-o://8ffcfbde2a1c3a459e0ce66bccb7fa28b98e55b2c8d21d9d8f540085375a38e4" gracePeriod=30 Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.574601 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-notification-agent" containerID="cri-o://8b000eaa2a08993718afb67ce9d0e3542283c37ab2c92a8bcbdc9d1403fee6b1" gracePeriod=30 Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.624107 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-a3b8-account-create-qm2vt" event={"ID":"138a7b81-cd2a-4106-8743-b1520c597d0b","Type":"ContainerDied","Data":"2704d41a752a4a7248480febcccf21c373196cdd72831126848905ca0dfe70c4"} Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.624428 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2704d41a752a4a7248480febcccf21c373196cdd72831126848905ca0dfe70c4" Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.624477 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-a3b8-account-create-qm2vt" Oct 03 09:08:42 crc kubenswrapper[4810]: W1003 09:08:42.970041 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64def3d7_6342_432a_a0c2_b562b7514bca.slice/crio-04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530 WatchSource:0}: Error finding container 04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530: Status 404 returned error can't find the container with id 04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530 Oct 03 09:08:42 crc kubenswrapper[4810]: I1003 09:08:42.978736 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.324080 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2973c82-79f4-492a-b9c8-414dae53b759" path="/var/lib/kubelet/pods/a2973c82-79f4-492a-b9c8-414dae53b759/volumes" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.636535 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64def3d7-6342-432a-a0c2-b562b7514bca","Type":"ContainerStarted","Data":"04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530"} Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.642966 4810 generic.go:334] "Generic (PLEG): container finished" podID="1941d65d-2868-461a-995d-4a931fdea880" containerID="2840747b51478463d6c5afc58d83f7b9ef58783f4b5bb105804f83bb4c64fca8" exitCode=0 Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.643222 4810 generic.go:334] "Generic (PLEG): container finished" podID="1941d65d-2868-461a-995d-4a931fdea880" containerID="8ffcfbde2a1c3a459e0ce66bccb7fa28b98e55b2c8d21d9d8f540085375a38e4" exitCode=2 Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.643306 4810 generic.go:334] "Generic (PLEG): container finished" podID="1941d65d-2868-461a-995d-4a931fdea880" containerID="d90cf4224decbb4f71ce96e20f0818572ce4e03e4c1206374fd696928ab618d8" exitCode=0 Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.643047 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerDied","Data":"2840747b51478463d6c5afc58d83f7b9ef58783f4b5bb105804f83bb4c64fca8"} Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.643472 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerDied","Data":"8ffcfbde2a1c3a459e0ce66bccb7fa28b98e55b2c8d21d9d8f540085375a38e4"} Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.643551 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerDied","Data":"d90cf4224decbb4f71ce96e20f0818572ce4e03e4c1206374fd696928ab618d8"} Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.852029 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-mh5pr"] Oct 03 09:08:43 crc kubenswrapper[4810]: E1003 09:08:43.853016 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138a7b81-cd2a-4106-8743-b1520c597d0b" containerName="mariadb-account-create" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.853042 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="138a7b81-cd2a-4106-8743-b1520c597d0b" containerName="mariadb-account-create" Oct 03 
09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.853446 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="138a7b81-cd2a-4106-8743-b1520c597d0b" containerName="mariadb-account-create" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.860096 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.866618 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.866925 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2pdj4" Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.866963 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-mh5pr"] Oct 03 09:08:43 crc kubenswrapper[4810]: I1003 09:08:43.867079 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.056386 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.056463 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.056517 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.056540 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsxbl\" (UniqueName: \"kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.158660 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.158738 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.158808 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle\") pod 
\"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.158842 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsxbl\" (UniqueName: \"kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.173419 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.173460 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.174217 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.198075 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsxbl\" (UniqueName: \"kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl\") pod \"aodh-db-sync-mh5pr\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.304829 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:08:44 crc kubenswrapper[4810]: E1003 09:08:44.305091 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.491338 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.676334 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64def3d7-6342-432a-a0c2-b562b7514bca","Type":"ContainerStarted","Data":"dbab8a7e512236638d4064d4a8006a3f143bbc0cfea49417d843d0140eb8cafb"} Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.686090 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 09:08:44 crc kubenswrapper[4810]: I1003 09:08:44.722903 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.338231219 podStartE2EDuration="3.722873745s" podCreationTimestamp="2025-10-03 09:08:41 +0000 UTC" firstStartedPulling="2025-10-03 09:08:42.973283714 +0000 UTC m=+7956.400534449" lastFinishedPulling="2025-10-03 09:08:43.35792623 +0000 UTC m=+7956.785176975" observedRunningTime="2025-10-03 09:08:44.70356023 +0000 UTC m=+7958.130810965" watchObservedRunningTime="2025-10-03 09:08:44.722873745 +0000 UTC m=+7958.150124470" Oct 03 09:08:45 crc kubenswrapper[4810]: I1003 09:08:45.036073 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-mh5pr"] Oct 03 09:08:45 crc kubenswrapper[4810]: I1003 09:08:45.709169 4810 generic.go:334] "Generic (PLEG): container finished" podID="1941d65d-2868-461a-995d-4a931fdea880" containerID="8b000eaa2a08993718afb67ce9d0e3542283c37ab2c92a8bcbdc9d1403fee6b1" exitCode=0 Oct 03 09:08:45 crc kubenswrapper[4810]: I1003 09:08:45.709273 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerDied","Data":"8b000eaa2a08993718afb67ce9d0e3542283c37ab2c92a8bcbdc9d1403fee6b1"} Oct 03 09:08:45 crc kubenswrapper[4810]: I1003 09:08:45.713714 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-mh5pr" event={"ID":"65d99cf4-6580-44c8-b7c5-807251deaf31","Type":"ContainerStarted","Data":"87e4a43cadaca28489dea35cca318b932900a0a76602c586bbd7595924e918f5"} Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.149588 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317408 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317547 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317652 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317680 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8b4f\" (UniqueName: \"kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317760 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.317802 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.318050 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts\") pod \"1941d65d-2868-461a-995d-4a931fdea880\" (UID: \"1941d65d-2868-461a-995d-4a931fdea880\") " Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.318242 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.318543 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.319789 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.319860 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1941d65d-2868-461a-995d-4a931fdea880-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.324090 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts" (OuterVolumeSpecName: "scripts") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.324313 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f" (OuterVolumeSpecName: "kube-api-access-b8b4f") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "kube-api-access-b8b4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.360713 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.416764 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.422435 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.422466 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8b4f\" (UniqueName: \"kubernetes.io/projected/1941d65d-2868-461a-995d-4a931fdea880-kube-api-access-b8b4f\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.422477 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.422487 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.471272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data" (OuterVolumeSpecName: "config-data") pod "1941d65d-2868-461a-995d-4a931fdea880" (UID: "1941d65d-2868-461a-995d-4a931fdea880"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.525072 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1941d65d-2868-461a-995d-4a931fdea880-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.725963 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1941d65d-2868-461a-995d-4a931fdea880","Type":"ContainerDied","Data":"2be7163d0b1221e856298496f4c73a89859544d31babbdd2f1c14c6c04d727d2"} Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.726034 4810 scope.go:117] "RemoveContainer" containerID="2840747b51478463d6c5afc58d83f7b9ef58783f4b5bb105804f83bb4c64fca8" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.726045 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.764113 4810 scope.go:117] "RemoveContainer" containerID="8ffcfbde2a1c3a459e0ce66bccb7fa28b98e55b2c8d21d9d8f540085375a38e4" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.766560 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.785629 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.805947 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:46 crc kubenswrapper[4810]: E1003 09:08:46.807261 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-central-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807312 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-central-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: E1003 09:08:46.807366 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="sg-core" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807375 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="sg-core" Oct 03 09:08:46 crc kubenswrapper[4810]: E1003 09:08:46.807413 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-notification-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807422 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-notification-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: E1003 09:08:46.807434 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="proxy-httpd" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807442 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="proxy-httpd" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807829 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="proxy-httpd" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807867 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="sg-core" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807886 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-central-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.807922 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1941d65d-2868-461a-995d-4a931fdea880" containerName="ceilometer-notification-agent" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.810312 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.813204 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.814433 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.817056 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.824722 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.829804 4810 scope.go:117] "RemoveContainer" containerID="8b000eaa2a08993718afb67ce9d0e3542283c37ab2c92a8bcbdc9d1403fee6b1" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.868362 4810 scope.go:117] "RemoveContainer" containerID="d90cf4224decbb4f71ce96e20f0818572ce4e03e4c1206374fd696928ab618d8" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930658 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930720 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930751 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930786 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930821 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930860 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930908 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:46 crc kubenswrapper[4810]: I1003 09:08:46.930949 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96lml\" (UniqueName: \"kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.043683 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96lml\" (UniqueName: \"kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.043773 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.043861 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.043931 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.044008 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.044080 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.044158 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.044235 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.085662 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.121704 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.122886 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.123028 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.123413 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.123448 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.123752 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96lml\" (UniqueName: \"kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.125859 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.134703 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.338466 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1941d65d-2868-461a-995d-4a931fdea880" path="/var/lib/kubelet/pods/1941d65d-2868-461a-995d-4a931fdea880/volumes" Oct 03 09:08:47 crc kubenswrapper[4810]: I1003 09:08:47.688098 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:08:48 crc kubenswrapper[4810]: I1003 09:08:48.868880 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:48 crc kubenswrapper[4810]: I1003 09:08:48.875077 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:49 crc kubenswrapper[4810]: I1003 09:08:49.767037 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 09:08:51 crc kubenswrapper[4810]: I1003 09:08:51.787026 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerStarted","Data":"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee"} Oct 03 09:08:51 crc kubenswrapper[4810]: I1003 09:08:51.787682 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerStarted","Data":"32ebedb94bfbf1ba4d58acc57be97e939068472c72c5ac4ec0567bb9546e2f25"} Oct 03 09:08:51 crc kubenswrapper[4810]: I1003 09:08:51.790290 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-mh5pr" event={"ID":"65d99cf4-6580-44c8-b7c5-807251deaf31","Type":"ContainerStarted","Data":"6f8d1a8773316235134643735f64759cbb71cf6070e1e1be8f28f85cc3728356"} Oct 03 09:08:51 crc kubenswrapper[4810]: I1003 09:08:51.810139 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-mh5pr" podStartSLOduration=2.6842708699999998 podStartE2EDuration="8.810114537s" podCreationTimestamp="2025-10-03 09:08:43 +0000 UTC" firstStartedPulling="2025-10-03 09:08:45.04158021 +0000 UTC m=+7958.468830945" lastFinishedPulling="2025-10-03 09:08:51.167423877 +0000 UTC m=+7964.594674612" observedRunningTime="2025-10-03 09:08:51.808832112 +0000 UTC m=+7965.236082847" watchObservedRunningTime="2025-10-03 09:08:51.810114537 +0000 UTC m=+7965.237365272" Oct 03 09:08:52 crc kubenswrapper[4810]: I1003 09:08:52.472847 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 09:08:52 crc kubenswrapper[4810]: I1003 09:08:52.801104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerStarted","Data":"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db"} Oct 03 09:08:54 crc kubenswrapper[4810]: I1003 09:08:54.831650 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerStarted","Data":"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6"} Oct 03 09:08:55 crc kubenswrapper[4810]: I1003 09:08:55.302302 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:08:55 crc kubenswrapper[4810]: E1003 09:08:55.302799 4810 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:08:56 crc kubenswrapper[4810]: I1003 09:08:56.851340 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerStarted","Data":"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850"} Oct 03 09:08:56 crc kubenswrapper[4810]: I1003 09:08:56.852053 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:08:56 crc kubenswrapper[4810]: I1003 09:08:56.853431 4810 generic.go:334] "Generic (PLEG): container finished" podID="65d99cf4-6580-44c8-b7c5-807251deaf31" containerID="6f8d1a8773316235134643735f64759cbb71cf6070e1e1be8f28f85cc3728356" exitCode=0 Oct 03 09:08:56 crc kubenswrapper[4810]: I1003 09:08:56.853471 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-mh5pr" event={"ID":"65d99cf4-6580-44c8-b7c5-807251deaf31","Type":"ContainerDied","Data":"6f8d1a8773316235134643735f64759cbb71cf6070e1e1be8f28f85cc3728356"} Oct 03 09:08:56 crc kubenswrapper[4810]: I1003 09:08:56.872300 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.70244039 podStartE2EDuration="10.872280897s" podCreationTimestamp="2025-10-03 09:08:46 +0000 UTC" firstStartedPulling="2025-10-03 09:08:50.91714051 +0000 UTC m=+7964.344391245" lastFinishedPulling="2025-10-03 09:08:56.086981017 +0000 UTC m=+7969.514231752" observedRunningTime="2025-10-03 09:08:56.871466765 +0000 UTC m=+7970.298717500" watchObservedRunningTime="2025-10-03 09:08:56.872280897 +0000 UTC m=+7970.299531622" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.479941 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.629070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle\") pod \"65d99cf4-6580-44c8-b7c5-807251deaf31\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.629177 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts\") pod \"65d99cf4-6580-44c8-b7c5-807251deaf31\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.629517 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsxbl\" (UniqueName: \"kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl\") pod \"65d99cf4-6580-44c8-b7c5-807251deaf31\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.629580 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data\") pod \"65d99cf4-6580-44c8-b7c5-807251deaf31\" (UID: \"65d99cf4-6580-44c8-b7c5-807251deaf31\") " Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.636341 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl" (OuterVolumeSpecName: "kube-api-access-gsxbl") pod "65d99cf4-6580-44c8-b7c5-807251deaf31" (UID: "65d99cf4-6580-44c8-b7c5-807251deaf31"). InnerVolumeSpecName "kube-api-access-gsxbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.636484 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts" (OuterVolumeSpecName: "scripts") pod "65d99cf4-6580-44c8-b7c5-807251deaf31" (UID: "65d99cf4-6580-44c8-b7c5-807251deaf31"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.664277 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data" (OuterVolumeSpecName: "config-data") pod "65d99cf4-6580-44c8-b7c5-807251deaf31" (UID: "65d99cf4-6580-44c8-b7c5-807251deaf31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.670859 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65d99cf4-6580-44c8-b7c5-807251deaf31" (UID: "65d99cf4-6580-44c8-b7c5-807251deaf31"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.738120 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.738171 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.738193 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65d99cf4-6580-44c8-b7c5-807251deaf31-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.738210 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsxbl\" (UniqueName: \"kubernetes.io/projected/65d99cf4-6580-44c8-b7c5-807251deaf31-kube-api-access-gsxbl\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.875109 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-mh5pr" event={"ID":"65d99cf4-6580-44c8-b7c5-807251deaf31","Type":"ContainerDied","Data":"87e4a43cadaca28489dea35cca318b932900a0a76602c586bbd7595924e918f5"} Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.875148 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e4a43cadaca28489dea35cca318b932900a0a76602c586bbd7595924e918f5" Oct 03 09:08:58 crc kubenswrapper[4810]: I1003 09:08:58.875222 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-mh5pr" Oct 03 09:09:02 crc kubenswrapper[4810]: I1003 09:09:02.057654 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-mrv6g"] Oct 03 09:09:02 crc kubenswrapper[4810]: I1003 09:09:02.069681 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-mrv6g"] Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.317414 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="161e5d26-127c-4659-a4a8-8d44c19f4c6d" path="/var/lib/kubelet/pods/161e5d26-127c-4659-a4a8-8d44c19f4c6d/volumes" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.392886 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:03 crc kubenswrapper[4810]: E1003 09:09:03.393368 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65d99cf4-6580-44c8-b7c5-807251deaf31" containerName="aodh-db-sync" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.393380 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="65d99cf4-6580-44c8-b7c5-807251deaf31" containerName="aodh-db-sync" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.393581 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="65d99cf4-6580-44c8-b7c5-807251deaf31" containerName="aodh-db-sync" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.395748 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.399699 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.399748 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2pdj4" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.400043 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.427974 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.544427 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cjr2\" (UniqueName: \"kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.544489 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.544520 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.544541 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.646408 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cjr2\" (UniqueName: \"kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.646485 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.646522 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.646548 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: 
I1003 09:09:03.653947 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.658681 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.661636 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.664538 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cjr2\" (UniqueName: \"kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2\") pod \"aodh-0\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " pod="openstack/aodh-0" Oct 03 09:09:03 crc kubenswrapper[4810]: I1003 09:09:03.736792 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:04 crc kubenswrapper[4810]: I1003 09:09:04.255800 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:04 crc kubenswrapper[4810]: W1003 09:09:04.282133 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f245998_6bd9_4933_ae52_3225dd7a5033.slice/crio-aead3a318e51fc03138ca4f55980e4b16838a5974d8b9b5208ca8798e602345a WatchSource:0}: Error finding container aead3a318e51fc03138ca4f55980e4b16838a5974d8b9b5208ca8798e602345a: Status 404 returned error can't find the container with id aead3a318e51fc03138ca4f55980e4b16838a5974d8b9b5208ca8798e602345a Oct 03 09:09:04 crc kubenswrapper[4810]: I1003 09:09:04.938746 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerStarted","Data":"fde9c096c494a54e3285ab4d7179b549419aa3a1503e807b34f59889a8727047"} Oct 03 09:09:04 crc kubenswrapper[4810]: I1003 09:09:04.939132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerStarted","Data":"aead3a318e51fc03138ca4f55980e4b16838a5974d8b9b5208ca8798e602345a"} Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.555435 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.556262 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="proxy-httpd" containerID="cri-o://87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850" gracePeriod=30 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.556296 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-notification-agent" 
containerID="cri-o://190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db" gracePeriod=30 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.556323 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="sg-core" containerID="cri-o://e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6" gracePeriod=30 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.556257 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-central-agent" containerID="cri-o://d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee" gracePeriod=30 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.963614 4810 generic.go:334] "Generic (PLEG): container finished" podID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerID="87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850" exitCode=0 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.964110 4810 generic.go:334] "Generic (PLEG): container finished" podID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerID="e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6" exitCode=2 Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.963826 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerDied","Data":"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850"} Oct 03 09:09:05 crc kubenswrapper[4810]: I1003 09:09:05.964164 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerDied","Data":"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6"} Oct 03 09:09:06 crc kubenswrapper[4810]: I1003 09:09:06.987856 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerStarted","Data":"669345b95730081e04f1d6ce608f363ba935d66cda7b365d8047607a8f4996b7"} Oct 03 09:09:06 crc kubenswrapper[4810]: I1003 09:09:06.992818 4810 generic.go:334] "Generic (PLEG): container finished" podID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerID="d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee" exitCode=0 Oct 03 09:09:06 crc kubenswrapper[4810]: I1003 09:09:06.992869 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerDied","Data":"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee"} Oct 03 09:09:07 crc kubenswrapper[4810]: I1003 09:09:07.085619 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:08 crc kubenswrapper[4810]: I1003 09:09:08.007693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerStarted","Data":"174509e4219e0381a84f0372d35f36735d50fdcec74ef46e9f0bf98e4cba9271"} Oct 03 09:09:08 crc kubenswrapper[4810]: I1003 09:09:08.302275 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:09:08 crc kubenswrapper[4810]: E1003 09:09:08.302690 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.011623 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.031306 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerStarted","Data":"66b96aac269e2b46cf7d98dce429f7534f686672c36e2bb0b2a6b76e17dc71e2"} Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.031447 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-api" containerID="cri-o://fde9c096c494a54e3285ab4d7179b549419aa3a1503e807b34f59889a8727047" gracePeriod=30 Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.031514 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-notifier" containerID="cri-o://174509e4219e0381a84f0372d35f36735d50fdcec74ef46e9f0bf98e4cba9271" gracePeriod=30 Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.031542 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-evaluator" containerID="cri-o://669345b95730081e04f1d6ce608f363ba935d66cda7b365d8047607a8f4996b7" gracePeriod=30 Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.031619 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-listener" containerID="cri-o://66b96aac269e2b46cf7d98dce429f7534f686672c36e2bb0b2a6b76e17dc71e2" gracePeriod=30 Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.038465 4810 generic.go:334] "Generic (PLEG): container finished" podID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerID="190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db" exitCode=0 Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.038518 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerDied","Data":"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db"} Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.038552 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2c12046c-57b2-4c4e-8cac-bbc7808efaa6","Type":"ContainerDied","Data":"32ebedb94bfbf1ba4d58acc57be97e939068472c72c5ac4ec0567bb9546e2f25"} Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.038574 4810 scope.go:117] "RemoveContainer" containerID="87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.038752 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.093417 4810 scope.go:117] "RemoveContainer" containerID="e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115515 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115612 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115698 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115798 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96lml\" (UniqueName: \"kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115946 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.115979 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.116019 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.116070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs\") pod \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\" (UID: \"2c12046c-57b2-4c4e-8cac-bbc7808efaa6\") " Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.119686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.125258 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml" (OuterVolumeSpecName: "kube-api-access-96lml") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "kube-api-access-96lml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.128053 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.101952888 podStartE2EDuration="7.128028154s" podCreationTimestamp="2025-10-03 09:09:03 +0000 UTC" firstStartedPulling="2025-10-03 09:09:04.286802571 +0000 UTC m=+7977.714053306" lastFinishedPulling="2025-10-03 09:09:09.312877837 +0000 UTC m=+7982.740128572" observedRunningTime="2025-10-03 09:09:10.093759348 +0000 UTC m=+7983.521010103" watchObservedRunningTime="2025-10-03 09:09:10.128028154 +0000 UTC m=+7983.555278889" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.138702 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.142075 4810 scope.go:117] "RemoveContainer" containerID="190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.173098 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts" (OuterVolumeSpecName: "scripts") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.188275 4810 scope.go:117] "RemoveContainer" containerID="d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.194745 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.219617 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.219650 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.219659 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.219668 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.219677 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96lml\" (UniqueName: \"kubernetes.io/projected/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-kube-api-access-96lml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.225134 4810 scope.go:117] "RemoveContainer" containerID="87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.225638 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850\": container with ID starting with 87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850 not found: ID does not exist" containerID="87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.225695 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850"} err="failed to get container status \"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850\": rpc error: code = NotFound desc = could not find container \"87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850\": container with ID starting with 87e6a0aeb455241152ae3b19158f03020542f29d1719d24b1ae3a66f3559f850 not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.225728 4810 scope.go:117] "RemoveContainer" containerID="e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.226599 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6\": container with ID starting with e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6 not found: ID does not exist" containerID="e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.226653 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6"} err="failed to get container status 
\"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6\": rpc error: code = NotFound desc = could not find container \"e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6\": container with ID starting with e3e2bdd4a10dae27174273b58e8596553629cd041533ba9a35de7d7475f0bcd6 not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.226675 4810 scope.go:117] "RemoveContainer" containerID="190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.226984 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db\": container with ID starting with 190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db not found: ID does not exist" containerID="190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.227020 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db"} err="failed to get container status \"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db\": rpc error: code = NotFound desc = could not find container \"190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db\": container with ID starting with 190fdfbcdc97b68cc8b753fa886bdacb0a97ec5fac8a11dea898a9685d1557db not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.227043 4810 scope.go:117] "RemoveContainer" containerID="d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.228161 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee\": container with ID starting with d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee not found: ID does not exist" containerID="d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.228191 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee"} err="failed to get container status \"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee\": rpc error: code = NotFound desc = could not find container \"d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee\": container with ID starting with d4b7560c8f0a2d86812f6c1027f359cfdc9382a75c32d7ae95e99846bbb29aee not found: ID does not exist" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.234075 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.273010 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.321497 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.321542 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.362081 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data" (OuterVolumeSpecName: "config-data") pod "2c12046c-57b2-4c4e-8cac-bbc7808efaa6" (UID: "2c12046c-57b2-4c4e-8cac-bbc7808efaa6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.423996 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c12046c-57b2-4c4e-8cac-bbc7808efaa6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.676792 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.687122 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.717032 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.717767 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="proxy-httpd" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.717793 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="proxy-httpd" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.717821 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-notification-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.717829 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-notification-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.717867 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-central-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.717877 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-central-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.717916 4810 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="sg-core" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.717926 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="sg-core" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.718223 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="proxy-httpd" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.718257 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="sg-core" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.718279 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-notification-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.718300 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" containerName="ceilometer-central-agent" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.721209 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.725215 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.725587 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.725793 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.728918 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.833740 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:10 crc kubenswrapper[4810]: E1003 09:09:10.834608 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-42dbl log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="dbb00a69-659c-40c9-8e47-99734b082ada" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836096 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836154 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836215 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" 
Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836278 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836303 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836356 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836374 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.836419 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42dbl\" (UniqueName: \"kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938490 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938548 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938631 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42dbl\" (UniqueName: \"kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938700 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938762 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938790 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938875 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.938923 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.939998 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.940320 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.943694 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.943745 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.944420 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.944625 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.947328 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data\") pod 
\"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:10 crc kubenswrapper[4810]: I1003 09:09:10.957357 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42dbl\" (UniqueName: \"kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl\") pod \"ceilometer-0\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " pod="openstack/ceilometer-0" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052426 4810 generic.go:334] "Generic (PLEG): container finished" podID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerID="174509e4219e0381a84f0372d35f36735d50fdcec74ef46e9f0bf98e4cba9271" exitCode=0 Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052738 4810 generic.go:334] "Generic (PLEG): container finished" podID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerID="669345b95730081e04f1d6ce608f363ba935d66cda7b365d8047607a8f4996b7" exitCode=0 Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052747 4810 generic.go:334] "Generic (PLEG): container finished" podID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerID="fde9c096c494a54e3285ab4d7179b549419aa3a1503e807b34f59889a8727047" exitCode=0 Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052511 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerDied","Data":"174509e4219e0381a84f0372d35f36735d50fdcec74ef46e9f0bf98e4cba9271"} Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052822 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerDied","Data":"669345b95730081e04f1d6ce608f363ba935d66cda7b365d8047607a8f4996b7"} Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.052836 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerDied","Data":"fde9c096c494a54e3285ab4d7179b549419aa3a1503e807b34f59889a8727047"} Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.054213 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.065409 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.142763 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.142871 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.142949 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143169 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143196 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143224 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143267 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143295 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143665 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.143785 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42dbl\" (UniqueName: \"kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl\") pod \"dbb00a69-659c-40c9-8e47-99734b082ada\" (UID: \"dbb00a69-659c-40c9-8e47-99734b082ada\") " Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.144612 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.144636 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dbb00a69-659c-40c9-8e47-99734b082ada-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.146633 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.147061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.147412 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data" (OuterVolumeSpecName: "config-data") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.147497 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.147879 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl" (OuterVolumeSpecName: "kube-api-access-42dbl") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "kube-api-access-42dbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.148063 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts" (OuterVolumeSpecName: "scripts") pod "dbb00a69-659c-40c9-8e47-99734b082ada" (UID: "dbb00a69-659c-40c9-8e47-99734b082ada"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246869 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42dbl\" (UniqueName: \"kubernetes.io/projected/dbb00a69-659c-40c9-8e47-99734b082ada-kube-api-access-42dbl\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246918 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246929 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246938 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246974 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.246983 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbb00a69-659c-40c9-8e47-99734b082ada-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:11 crc kubenswrapper[4810]: I1003 09:09:11.318031 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2c12046c-57b2-4c4e-8cac-bbc7808efaa6" path="/var/lib/kubelet/pods/2c12046c-57b2-4c4e-8cac-bbc7808efaa6/volumes" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.063296 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.123690 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.144108 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.161030 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.163546 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.165595 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.166078 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.166267 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.188260 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263698 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263755 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263847 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263870 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263905 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.263947 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-tjk96\" (UniqueName: \"kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.264010 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.264054 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366214 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjk96\" (UniqueName: \"kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366352 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366418 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366443 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366504 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366571 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366595 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.366639 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.368446 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.370633 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.371766 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.372274 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.373376 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.375836 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.377789 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.386395 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjk96\" (UniqueName: \"kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96\") pod \"ceilometer-0\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.490325 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:09:12 crc kubenswrapper[4810]: I1003 09:09:12.972886 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:09:12 crc kubenswrapper[4810]: W1003 09:09:12.987150 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8a0d09_74be_4ada_a91c_90a30042cc32.slice/crio-e1b5017222ecf21b85d6a7a9f987e3981f732f94e571015860f95d8d17f2d5c3 WatchSource:0}: Error finding container e1b5017222ecf21b85d6a7a9f987e3981f732f94e571015860f95d8d17f2d5c3: Status 404 returned error can't find the container with id e1b5017222ecf21b85d6a7a9f987e3981f732f94e571015860f95d8d17f2d5c3 Oct 03 09:09:13 crc kubenswrapper[4810]: I1003 09:09:13.030806 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-35f0-account-create-7x5x8"] Oct 03 09:09:13 crc kubenswrapper[4810]: I1003 09:09:13.039852 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-35f0-account-create-7x5x8"] Oct 03 09:09:13 crc kubenswrapper[4810]: I1003 09:09:13.094919 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerStarted","Data":"e1b5017222ecf21b85d6a7a9f987e3981f732f94e571015860f95d8d17f2d5c3"} Oct 03 09:09:13 crc kubenswrapper[4810]: I1003 09:09:13.314175 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932" path="/var/lib/kubelet/pods/9b8af1d9-1a3d-40e2-b33a-7fa4eac7f932/volumes" Oct 03 09:09:13 crc kubenswrapper[4810]: I1003 09:09:13.316042 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb00a69-659c-40c9-8e47-99734b082ada" path="/var/lib/kubelet/pods/dbb00a69-659c-40c9-8e47-99734b082ada/volumes" Oct 03 09:09:14 crc kubenswrapper[4810]: I1003 09:09:14.105754 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerStarted","Data":"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52"} Oct 03 09:09:14 crc kubenswrapper[4810]: I1003 09:09:14.106105 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerStarted","Data":"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8"} Oct 03 09:09:15 crc kubenswrapper[4810]: I1003 09:09:15.118020 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerStarted","Data":"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318"} Oct 03 09:09:17 crc kubenswrapper[4810]: I1003 09:09:17.146184 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerStarted","Data":"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62"} Oct 03 09:09:17 crc kubenswrapper[4810]: I1003 09:09:17.148276 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:09:17 crc kubenswrapper[4810]: I1003 09:09:17.179531 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2316762470000002 podStartE2EDuration="5.17950866s" podCreationTimestamp="2025-10-03 09:09:12 +0000 UTC" 
firstStartedPulling="2025-10-03 09:09:12.989745777 +0000 UTC m=+7986.416996512" lastFinishedPulling="2025-10-03 09:09:15.93757819 +0000 UTC m=+7989.364828925" observedRunningTime="2025-10-03 09:09:17.174349033 +0000 UTC m=+7990.601599768" watchObservedRunningTime="2025-10-03 09:09:17.17950866 +0000 UTC m=+7990.606759395" Oct 03 09:09:19 crc kubenswrapper[4810]: I1003 09:09:19.303087 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:09:19 crc kubenswrapper[4810]: E1003 09:09:19.303701 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:09:24 crc kubenswrapper[4810]: I1003 09:09:24.033888 4810 scope.go:117] "RemoveContainer" containerID="a9076a758ff7aef766a584d973bb5be65b9283fd5090eea5f462ca0f42d6083f" Oct 03 09:09:24 crc kubenswrapper[4810]: I1003 09:09:24.060015 4810 scope.go:117] "RemoveContainer" containerID="a83b74c1210a375c8424e845c6ab6a049c9e6f21598bca2ba113f0fff9f45b9d" Oct 03 09:09:24 crc kubenswrapper[4810]: I1003 09:09:24.085535 4810 scope.go:117] "RemoveContainer" containerID="2d8466c62ce7e0b9940f0c67d89c783acd7ad0cbc026a4854d06e731b5f4d488" Oct 03 09:09:24 crc kubenswrapper[4810]: I1003 09:09:24.138597 4810 scope.go:117] "RemoveContainer" containerID="ec87bc4bd641aeabbf619cd1b5bca548318cb67371930db7ae78db54cb17b90b" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.432547 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.437062 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.450877 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.587200 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.587272 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx5b\" (UniqueName: \"kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.588041 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.690982 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.691057 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sx5b\" (UniqueName: \"kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.691181 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.692497 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.694599 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.711620 4810 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5sx5b\" (UniqueName: \"kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b\") pod \"certified-operators-w6tzj\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:30 crc kubenswrapper[4810]: I1003 09:09:30.769586 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:31 crc kubenswrapper[4810]: I1003 09:09:31.279361 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:31 crc kubenswrapper[4810]: W1003 09:09:31.290006 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf59b393_7cf5_49e9_8610_3f897375f6c5.slice/crio-2ef638fdc14dd7375618fc6fdf05ad2740a50bd2fdfa6ca883020e69810cceab WatchSource:0}: Error finding container 2ef638fdc14dd7375618fc6fdf05ad2740a50bd2fdfa6ca883020e69810cceab: Status 404 returned error can't find the container with id 2ef638fdc14dd7375618fc6fdf05ad2740a50bd2fdfa6ca883020e69810cceab Oct 03 09:09:32 crc kubenswrapper[4810]: I1003 09:09:32.312333 4810 generic.go:334] "Generic (PLEG): container finished" podID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerID="0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7" exitCode=0 Oct 03 09:09:32 crc kubenswrapper[4810]: I1003 09:09:32.312479 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerDied","Data":"0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7"} Oct 03 09:09:32 crc kubenswrapper[4810]: I1003 09:09:32.312938 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerStarted","Data":"2ef638fdc14dd7375618fc6fdf05ad2740a50bd2fdfa6ca883020e69810cceab"} Oct 03 09:09:33 crc kubenswrapper[4810]: I1003 09:09:33.302906 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:09:33 crc kubenswrapper[4810]: E1003 09:09:33.303579 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:09:33 crc kubenswrapper[4810]: I1003 09:09:33.326517 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerStarted","Data":"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773"} Oct 03 09:09:35 crc kubenswrapper[4810]: I1003 09:09:35.349020 4810 generic.go:334] "Generic (PLEG): container finished" podID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerID="91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773" exitCode=0 Oct 03 09:09:35 crc kubenswrapper[4810]: I1003 09:09:35.349436 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" 
event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerDied","Data":"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773"} Oct 03 09:09:36 crc kubenswrapper[4810]: I1003 09:09:36.056303 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-c4r7k"] Oct 03 09:09:36 crc kubenswrapper[4810]: I1003 09:09:36.067883 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-c4r7k"] Oct 03 09:09:36 crc kubenswrapper[4810]: I1003 09:09:36.363168 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerStarted","Data":"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648"} Oct 03 09:09:36 crc kubenswrapper[4810]: I1003 09:09:36.389849 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w6tzj" podStartSLOduration=2.592241445 podStartE2EDuration="6.38983277s" podCreationTimestamp="2025-10-03 09:09:30 +0000 UTC" firstStartedPulling="2025-10-03 09:09:32.316245302 +0000 UTC m=+8005.743496037" lastFinishedPulling="2025-10-03 09:09:36.113836627 +0000 UTC m=+8009.541087362" observedRunningTime="2025-10-03 09:09:36.385421233 +0000 UTC m=+8009.812671968" watchObservedRunningTime="2025-10-03 09:09:36.38983277 +0000 UTC m=+8009.817083505" Oct 03 09:09:37 crc kubenswrapper[4810]: I1003 09:09:37.319926 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3091bacd-d5fb-4b1a-be28-1d3ae31ff392" path="/var/lib/kubelet/pods/3091bacd-d5fb-4b1a-be28-1d3ae31ff392/volumes" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.413177 4810 generic.go:334] "Generic (PLEG): container finished" podID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerID="66b96aac269e2b46cf7d98dce429f7534f686672c36e2bb0b2a6b76e17dc71e2" exitCode=137 Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.414009 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerDied","Data":"66b96aac269e2b46cf7d98dce429f7534f686672c36e2bb0b2a6b76e17dc71e2"} Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.527227 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.658828 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts\") pod \"8f245998-6bd9-4933-ae52-3225dd7a5033\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.659097 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data\") pod \"8f245998-6bd9-4933-ae52-3225dd7a5033\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.659200 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cjr2\" (UniqueName: \"kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2\") pod \"8f245998-6bd9-4933-ae52-3225dd7a5033\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.659268 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle\") pod \"8f245998-6bd9-4933-ae52-3225dd7a5033\" (UID: \"8f245998-6bd9-4933-ae52-3225dd7a5033\") " Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.666106 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2" (OuterVolumeSpecName: "kube-api-access-6cjr2") pod "8f245998-6bd9-4933-ae52-3225dd7a5033" (UID: "8f245998-6bd9-4933-ae52-3225dd7a5033"). InnerVolumeSpecName "kube-api-access-6cjr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.666671 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts" (OuterVolumeSpecName: "scripts") pod "8f245998-6bd9-4933-ae52-3225dd7a5033" (UID: "8f245998-6bd9-4933-ae52-3225dd7a5033"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.762498 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cjr2\" (UniqueName: \"kubernetes.io/projected/8f245998-6bd9-4933-ae52-3225dd7a5033-kube-api-access-6cjr2\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.762537 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.770516 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.771726 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.784722 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f245998-6bd9-4933-ae52-3225dd7a5033" (UID: "8f245998-6bd9-4933-ae52-3225dd7a5033"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.788885 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data" (OuterVolumeSpecName: "config-data") pod "8f245998-6bd9-4933-ae52-3225dd7a5033" (UID: "8f245998-6bd9-4933-ae52-3225dd7a5033"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.824545 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.864383 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:40 crc kubenswrapper[4810]: I1003 09:09:40.864682 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f245998-6bd9-4933-ae52-3225dd7a5033-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.427323 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.427327 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8f245998-6bd9-4933-ae52-3225dd7a5033","Type":"ContainerDied","Data":"aead3a318e51fc03138ca4f55980e4b16838a5974d8b9b5208ca8798e602345a"} Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.427413 4810 scope.go:117] "RemoveContainer" containerID="66b96aac269e2b46cf7d98dce429f7534f686672c36e2bb0b2a6b76e17dc71e2" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.465768 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.479764 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.485332 4810 scope.go:117] "RemoveContainer" containerID="174509e4219e0381a84f0372d35f36735d50fdcec74ef46e9f0bf98e4cba9271" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.497263 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:41 crc kubenswrapper[4810]: E1003 09:09:41.497755 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-listener" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.497775 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-listener" Oct 03 09:09:41 crc kubenswrapper[4810]: E1003 09:09:41.497788 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-api" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.497796 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-api" Oct 03 09:09:41 crc kubenswrapper[4810]: E1003 09:09:41.497834 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-evaluator" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.497842 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-evaluator" Oct 03 09:09:41 crc kubenswrapper[4810]: E1003 09:09:41.497855 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-notifier" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.497861 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-notifier" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.498106 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-notifier" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.498129 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-listener" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.498147 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-api" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.498175 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" containerName="aodh-evaluator" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.502141 4810 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.504991 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2pdj4" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.509166 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.509318 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.509862 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.511016 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.526206 4810 scope.go:117] "RemoveContainer" containerID="669345b95730081e04f1d6ce608f363ba935d66cda7b365d8047607a8f4996b7" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.526354 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.528497 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.559578 4810 scope.go:117] "RemoveContainer" containerID="fde9c096c494a54e3285ab4d7179b549419aa3a1503e807b34f59889a8727047" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579090 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579169 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579220 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579263 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psh77\" (UniqueName: \"kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579288 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.579481 4810 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.682714 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.682837 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.682914 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.682963 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psh77\" (UniqueName: \"kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.682992 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.683242 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.689206 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.689206 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.689926 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.690338 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.690351 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.702788 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psh77\" (UniqueName: \"kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77\") pod \"aodh-0\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " pod="openstack/aodh-0" Oct 03 09:09:41 crc kubenswrapper[4810]: I1003 09:09:41.832744 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 09:09:42 crc kubenswrapper[4810]: I1003 09:09:42.297333 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 03 09:09:42 crc kubenswrapper[4810]: W1003 09:09:42.307545 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a54421e_d5ab_4e38_a6da_d90860a2b3b3.slice/crio-dc36f29d4851ccdd10540d1952a6da61ea70f4d8b54c849a51e8c9416da5792b WatchSource:0}: Error finding container dc36f29d4851ccdd10540d1952a6da61ea70f4d8b54c849a51e8c9416da5792b: Status 404 returned error can't find the container with id dc36f29d4851ccdd10540d1952a6da61ea70f4d8b54c849a51e8c9416da5792b Oct 03 09:09:42 crc kubenswrapper[4810]: I1003 09:09:42.440975 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerStarted","Data":"dc36f29d4851ccdd10540d1952a6da61ea70f4d8b54c849a51e8c9416da5792b"} Oct 03 09:09:42 crc kubenswrapper[4810]: I1003 09:09:42.503789 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 09:09:43 crc kubenswrapper[4810]: I1003 09:09:43.225477 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:43 crc kubenswrapper[4810]: I1003 09:09:43.317349 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f245998-6bd9-4933-ae52-3225dd7a5033" path="/var/lib/kubelet/pods/8f245998-6bd9-4933-ae52-3225dd7a5033/volumes" Oct 03 09:09:43 crc kubenswrapper[4810]: I1003 09:09:43.456255 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerStarted","Data":"08cd416b8eeef505b3e0b38ba8da9d068bedaeea3f4d831442ca091390459861"} Oct 03 09:09:44 crc kubenswrapper[4810]: I1003 09:09:44.303561 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:09:44 crc kubenswrapper[4810]: E1003 09:09:44.304567 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 
09:09:44 crc kubenswrapper[4810]: I1003 09:09:44.472167 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerStarted","Data":"c978bb239f34435050aeea1f40571c0760c669b74d2a1948cabd533429556d2a"} Oct 03 09:09:44 crc kubenswrapper[4810]: I1003 09:09:44.472234 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerStarted","Data":"5440941186e8bdddd5f88722a1e8ed75a86f824038cb407b01408da32a0da878"} Oct 03 09:09:44 crc kubenswrapper[4810]: I1003 09:09:44.472323 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w6tzj" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="registry-server" containerID="cri-o://6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648" gracePeriod=2 Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.331959 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.465382 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities\") pod \"df59b393-7cf5-49e9-8610-3f897375f6c5\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.465793 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content\") pod \"df59b393-7cf5-49e9-8610-3f897375f6c5\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.465911 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sx5b\" (UniqueName: \"kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b\") pod \"df59b393-7cf5-49e9-8610-3f897375f6c5\" (UID: \"df59b393-7cf5-49e9-8610-3f897375f6c5\") " Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.466268 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities" (OuterVolumeSpecName: "utilities") pod "df59b393-7cf5-49e9-8610-3f897375f6c5" (UID: "df59b393-7cf5-49e9-8610-3f897375f6c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.466856 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.472488 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b" (OuterVolumeSpecName: "kube-api-access-5sx5b") pod "df59b393-7cf5-49e9-8610-3f897375f6c5" (UID: "df59b393-7cf5-49e9-8610-3f897375f6c5"). InnerVolumeSpecName "kube-api-access-5sx5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.485549 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerStarted","Data":"2ba2dbc8167fc1bd6f1b2a1f51b4d2aeb50ad0c06902101cb2b2d5cf0dcefff2"} Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.490688 4810 generic.go:334] "Generic (PLEG): container finished" podID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerID="6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648" exitCode=0 Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.490734 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerDied","Data":"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648"} Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.490764 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6tzj" event={"ID":"df59b393-7cf5-49e9-8610-3f897375f6c5","Type":"ContainerDied","Data":"2ef638fdc14dd7375618fc6fdf05ad2740a50bd2fdfa6ca883020e69810cceab"} Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.490784 4810 scope.go:117] "RemoveContainer" containerID="6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.490789 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6tzj" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.515088 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.22321836 podStartE2EDuration="4.515068579s" podCreationTimestamp="2025-10-03 09:09:41 +0000 UTC" firstStartedPulling="2025-10-03 09:09:42.311956385 +0000 UTC m=+8015.739207120" lastFinishedPulling="2025-10-03 09:09:44.603806604 +0000 UTC m=+8018.031057339" observedRunningTime="2025-10-03 09:09:45.512652964 +0000 UTC m=+8018.939903709" watchObservedRunningTime="2025-10-03 09:09:45.515068579 +0000 UTC m=+8018.942319314" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.547044 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df59b393-7cf5-49e9-8610-3f897375f6c5" (UID: "df59b393-7cf5-49e9-8610-3f897375f6c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.567530 4810 scope.go:117] "RemoveContainer" containerID="91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.575486 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df59b393-7cf5-49e9-8610-3f897375f6c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.575521 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sx5b\" (UniqueName: \"kubernetes.io/projected/df59b393-7cf5-49e9-8610-3f897375f6c5-kube-api-access-5sx5b\") on node \"crc\" DevicePath \"\"" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.646163 4810 scope.go:117] "RemoveContainer" containerID="0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.716177 4810 scope.go:117] "RemoveContainer" containerID="6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648" Oct 03 09:09:45 crc kubenswrapper[4810]: E1003 09:09:45.717005 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648\": container with ID starting with 6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648 not found: ID does not exist" containerID="6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.717059 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648"} err="failed to get container status \"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648\": rpc error: code = NotFound desc = could not find container \"6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648\": container with ID starting with 6334b6668b4cf799f4a11f10e62626e2e43a0351e6091edc37d819dbbe5f4648 not found: ID does not exist" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.717097 4810 scope.go:117] "RemoveContainer" containerID="91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773" Oct 03 09:09:45 crc kubenswrapper[4810]: E1003 09:09:45.720323 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773\": container with ID starting with 91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773 not found: ID does not exist" containerID="91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.720372 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773"} err="failed to get container status \"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773\": rpc error: code = NotFound desc = could not find container \"91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773\": container with ID starting with 91d7705aba8ad140eb647d1ab03716d1e0a973500fb334ca0757278b90200773 not found: ID does not exist" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.720407 4810 scope.go:117] "RemoveContainer" 
containerID="0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7" Oct 03 09:09:45 crc kubenswrapper[4810]: E1003 09:09:45.720854 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7\": container with ID starting with 0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7 not found: ID does not exist" containerID="0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.723651 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7"} err="failed to get container status \"0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7\": rpc error: code = NotFound desc = could not find container \"0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7\": container with ID starting with 0f7cc777df881e4c05391e4dc00c2a4be1ad1a38cc833e81ff8d605dca8f30d7 not found: ID does not exist" Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.829425 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:45 crc kubenswrapper[4810]: I1003 09:09:45.841180 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w6tzj"] Oct 03 09:09:47 crc kubenswrapper[4810]: I1003 09:09:47.346474 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" path="/var/lib/kubelet/pods/df59b393-7cf5-49e9-8610-3f897375f6c5/volumes" Oct 03 09:09:58 crc kubenswrapper[4810]: I1003 09:09:58.303200 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:09:58 crc kubenswrapper[4810]: E1003 09:09:58.303983 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:10:06 crc kubenswrapper[4810]: I1003 09:10:06.045917 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lxxxj"] Oct 03 09:10:06 crc kubenswrapper[4810]: I1003 09:10:06.081346 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lxxxj"] Oct 03 09:10:07 crc kubenswrapper[4810]: I1003 09:10:07.318503 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95e2e2a-9340-4300-ac33-887767f7e5cf" path="/var/lib/kubelet/pods/d95e2e2a-9340-4300-ac33-887767f7e5cf/volumes" Oct 03 09:10:10 crc kubenswrapper[4810]: I1003 09:10:10.303220 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:10:10 crc kubenswrapper[4810]: E1003 09:10:10.303883 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:10:15 crc kubenswrapper[4810]: I1003 09:10:15.030774 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bfbb-account-create-q9qdc"] Oct 03 09:10:15 crc kubenswrapper[4810]: I1003 09:10:15.042618 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bfbb-account-create-q9qdc"] Oct 03 09:10:15 crc kubenswrapper[4810]: I1003 09:10:15.321003 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c121bd-7601-4845-8b51-c72706f78cbe" path="/var/lib/kubelet/pods/65c121bd-7601-4845-8b51-c72706f78cbe/volumes" Oct 03 09:10:23 crc kubenswrapper[4810]: I1003 09:10:23.303712 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:10:23 crc kubenswrapper[4810]: E1003 09:10:23.305104 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:10:24 crc kubenswrapper[4810]: I1003 09:10:24.319191 4810 scope.go:117] "RemoveContainer" containerID="1b9be289c09724a6a864c0593412c4c8e33195aca6ca4aaa5b91e450f9ef7bb2" Oct 03 09:10:24 crc kubenswrapper[4810]: I1003 09:10:24.375148 4810 scope.go:117] "RemoveContainer" containerID="a1a04de7847d46fbcd88ab5dbf418d466bbf53cbbc2ab4300ab5479a4f000cdb" Oct 03 09:10:24 crc kubenswrapper[4810]: I1003 09:10:24.478455 4810 scope.go:117] "RemoveContainer" containerID="0f21340868b9429edd2728b912b4f49b1e34b40140f6b795bd7c4f57f73d6f69" Oct 03 09:10:26 crc kubenswrapper[4810]: I1003 09:10:26.046346 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hkm7m"] Oct 03 09:10:26 crc kubenswrapper[4810]: I1003 09:10:26.055551 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hkm7m"] Oct 03 09:10:27 crc kubenswrapper[4810]: I1003 09:10:27.319149 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38" path="/var/lib/kubelet/pods/7fe7112b-a8c0-4c69-bb2a-6f322ec0ac38/volumes" Oct 03 09:10:34 crc kubenswrapper[4810]: I1003 09:10:34.302176 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:10:35 crc kubenswrapper[4810]: I1003 09:10:35.057567 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0"} Oct 03 09:11:21 crc kubenswrapper[4810]: I1003 09:11:21.060499 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vxq2p"] Oct 03 09:11:21 crc kubenswrapper[4810]: I1003 09:11:21.072393 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vxq2p"] Oct 03 09:11:21 crc kubenswrapper[4810]: I1003 09:11:21.315092 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d118cd8-5e30-48cc-aee4-db15dfb65704" 
path="/var/lib/kubelet/pods/1d118cd8-5e30-48cc-aee4-db15dfb65704/volumes" Oct 03 09:11:22 crc kubenswrapper[4810]: I1003 09:11:22.028277 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tfcdh"] Oct 03 09:11:22 crc kubenswrapper[4810]: I1003 09:11:22.040221 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-s9mvg"] Oct 03 09:11:22 crc kubenswrapper[4810]: I1003 09:11:22.051019 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tfcdh"] Oct 03 09:11:22 crc kubenswrapper[4810]: I1003 09:11:22.061461 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-s9mvg"] Oct 03 09:11:23 crc kubenswrapper[4810]: I1003 09:11:23.313883 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2d06bb-19a4-4aa4-aec7-783a567726db" path="/var/lib/kubelet/pods/3e2d06bb-19a4-4aa4-aec7-783a567726db/volumes" Oct 03 09:11:23 crc kubenswrapper[4810]: I1003 09:11:23.314444 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b1790a9-927d-4618-9eb7-c5ced54af4a9" path="/var/lib/kubelet/pods/4b1790a9-927d-4618-9eb7-c5ced54af4a9/volumes" Oct 03 09:11:24 crc kubenswrapper[4810]: I1003 09:11:24.600510 4810 scope.go:117] "RemoveContainer" containerID="d701b87a14577f87b03e4fa3f6ca9e1ae7965d82f527c984c868c7194359f6a3" Oct 03 09:11:24 crc kubenswrapper[4810]: I1003 09:11:24.640950 4810 scope.go:117] "RemoveContainer" containerID="cf66cf254afe632778a8748f2f6bcac4a8b8451ec367729eb4ff712c93100586" Oct 03 09:11:24 crc kubenswrapper[4810]: I1003 09:11:24.675958 4810 scope.go:117] "RemoveContainer" containerID="43165b7a64643ef3cab4e29178f0beafafa146c94e61751d37588312b350db8e" Oct 03 09:11:24 crc kubenswrapper[4810]: I1003 09:11:24.723109 4810 scope.go:117] "RemoveContainer" containerID="040748a82a5e2224d0965fac2adc7254c91b10c0aa83bf36ab8b1bd8a548c542" Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.042248 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d0e8-account-create-f4k84"] Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.060891 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3fd2-account-create-qmhk4"] Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.070433 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6644-account-create-pflxp"] Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.078306 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3fd2-account-create-qmhk4"] Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.086774 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6644-account-create-pflxp"] Oct 03 09:11:32 crc kubenswrapper[4810]: I1003 09:11:32.096848 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d0e8-account-create-f4k84"] Oct 03 09:11:33 crc kubenswrapper[4810]: I1003 09:11:33.316553 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60dae612-9f86-4dea-a7ca-c641b3622c7d" path="/var/lib/kubelet/pods/60dae612-9f86-4dea-a7ca-c641b3622c7d/volumes" Oct 03 09:11:33 crc kubenswrapper[4810]: I1003 09:11:33.317399 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b53b561d-83d8-4ca0-bfaa-a3f8473373fa" path="/var/lib/kubelet/pods/b53b561d-83d8-4ca0-bfaa-a3f8473373fa/volumes" Oct 03 09:11:33 crc kubenswrapper[4810]: I1003 09:11:33.317969 4810 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dda5331c-3fec-4b0e-85da-2783fc44b5a2" path="/var/lib/kubelet/pods/dda5331c-3fec-4b0e-85da-2783fc44b5a2/volumes" Oct 03 09:11:50 crc kubenswrapper[4810]: I1003 09:11:50.037430 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qcck4"] Oct 03 09:11:50 crc kubenswrapper[4810]: I1003 09:11:50.046522 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qcck4"] Oct 03 09:11:51 crc kubenswrapper[4810]: I1003 09:11:51.319490 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a4752c-c640-4ea1-9044-a70c59b85c50" path="/var/lib/kubelet/pods/57a4752c-c640-4ea1-9044-a70c59b85c50/volumes" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.614134 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.615080 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" containerName="openstackclient" containerID="cri-o://38f4857d992be3b22e9531eec0f7a8496e516e8d7855074bc7163e25e6d53fa8" gracePeriod=2 Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.639107 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.712454 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.713992 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-log" containerID="cri-o://8e41db6129ebb440e14b354aafb38602fc26eb206790f69566dcdbdb50d588a8" gracePeriod=30 Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.714170 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-httpd" containerID="cri-o://787a3821a90f8b366d38035e7db82d3692844c42b7d2bf85811ba83389e1173a" gracePeriod=30 Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890238 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-76bc9bd849-gxqg6"] Oct 03 09:12:02 crc kubenswrapper[4810]: E1003 09:12:02.890633 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="extract-utilities" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890644 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="extract-utilities" Oct 03 09:12:02 crc kubenswrapper[4810]: E1003 09:12:02.890655 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="registry-server" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890661 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="registry-server" Oct 03 09:12:02 crc kubenswrapper[4810]: E1003 09:12:02.890690 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" containerName="openstackclient" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890696 4810 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" containerName="openstackclient" Oct 03 09:12:02 crc kubenswrapper[4810]: E1003 09:12:02.890723 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="extract-content" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890728 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="extract-content" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890933 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="df59b393-7cf5-49e9-8610-3f897375f6c5" containerName="registry-server" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.890954 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" containerName="openstackclient" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.891701 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.944261 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76bc9bd849-gxqg6"] Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.967617 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerID="8e41db6129ebb440e14b354aafb38602fc26eb206790f69566dcdbdb50d588a8" exitCode=143 Oct 03 09:12:02 crc kubenswrapper[4810]: I1003 09:12:02.967673 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerDied","Data":"8e41db6129ebb440e14b354aafb38602fc26eb206790f69566dcdbdb50d588a8"} Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.066668 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.125724 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.125863 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.125953 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.125992 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" 
(UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.200543 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.201978 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.225501 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.230525 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.230585 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.231640 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.232076 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.232316 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6dcd\" (UniqueName: \"kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd\") pod \"glance35f0-account-delete-h8t2h\" (UID: \"a7e14d67-c732-445d-b7c7-ffb607ff1d3a\") " pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.234528 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.234585 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:03.734567926 +0000 UTC m=+8157.161818661 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.235019 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.235052 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data podName:849f27b3-b111-4722-9e85-528f2fbed78d nodeName:}" failed. No retries permitted until 2025-10-03 09:12:03.735042539 +0000 UTC m=+8157.162293354 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data") pod "rabbitmq-server-0" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d") : configmap "rabbitmq-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.244803 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.246655 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.260119 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.268808 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.282208 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.282285 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:03.78226437 +0000 UTC m=+8157.209515105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.336602 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6dcd\" (UniqueName: \"kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd\") pod \"glance35f0-account-delete-h8t2h\" (UID: \"a7e14d67-c732-445d-b7c7-ffb607ff1d3a\") " pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.412378 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.412420 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.412431 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.419459 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="openstack-network-exporter" containerID="cri-o://7bb028b7a5bb063820d9ba2accea2471b634b11658dc72ef39f28643120fa6d7" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.419846 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="openstack-network-exporter" containerID="cri-o://a3d3c9e0db1422a18299e83c0a317086eace0fbe45d5f64b129666d18947412f" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.437085 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6dcd\" (UniqueName: \"kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd\") pod \"glance35f0-account-delete-h8t2h\" (UID: \"a7e14d67-c732-445d-b7c7-ffb607ff1d3a\") " pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.447645 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjkr\" (UniqueName: \"kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr\") pod \"heatd134-account-delete-5svfk\" (UID: \"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9\") " pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.452047 4810 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.452121 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:03.952101714 +0000 UTC m=+8157.379352449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.452191 4810 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.452215 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:03.952208287 +0000 UTC m=+8157.379459022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-scripts" not found Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.469174 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.469850 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="openstack-network-exporter" containerID="cri-o://9e8625267188d3164afef69fbc3a4e9502bd91a19d31086839f399a5a43e1daa" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.515938 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.528719 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.541355 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.549656 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjkr\" (UniqueName: \"kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr\") pod \"heatd134-account-delete-5svfk\" (UID: \"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9\") " pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.633612 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.644994 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjkr\" (UniqueName: \"kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr\") pod \"heatd134-account-delete-5svfk\" (UID: \"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9\") " pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.651654 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9ct\" (UniqueName: \"kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct\") pod \"cinder4b2d-account-delete-5zfwv\" (UID: \"a0ff0594-a3c0-4939-8113-d4611d0a53ce\") " pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.708006 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="ovsdbserver-nb" containerID="cri-o://6c91709fb1869d8c7dcdbb6f88a006eabccffee810f357690eff8446064838c1" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.715598 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.716509 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="openstack-network-exporter" containerID="cri-o://0e74de867618f048890730ae1b8ae54e811efdf031ec6aee30450263081817ad" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.721696 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.754826 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.755536 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9ct\" (UniqueName: \"kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct\") pod \"cinder4b2d-account-delete-5zfwv\" (UID: \"a0ff0594-a3c0-4939-8113-d4611d0a53ce\") " pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.756165 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.756215 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.756201383 +0000 UTC m=+8158.183452118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.756515 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.756555 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data podName:849f27b3-b111-4722-9e85-528f2fbed78d nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.756544833 +0000 UTC m=+8158.183795568 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data") pod "rabbitmq-server-0" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d") : configmap "rabbitmq-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.773655 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.774170 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="openstack-network-exporter" containerID="cri-o://1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.814017 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.815522 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9ct\" (UniqueName: \"kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct\") pod \"cinder4b2d-account-delete-5zfwv\" (UID: \"a0ff0594-a3c0-4939-8113-d4611d0a53ce\") " pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.815963 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="openstack-network-exporter" containerID="cri-o://9f40d85da6ecfa7849d414ba4e3f550a824b8bba334f771ee2af8a049f0905e5" gracePeriod=300 Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.858972 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.859453 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.861247 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.877777 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.877841 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.87782163 +0000 UTC m=+8158.305072365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.892530 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.908034 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.925955 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.927367 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.946257 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.947756 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:03 crc kubenswrapper[4810]: I1003 09:12:03.970707 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhrg\" (UniqueName: \"kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg\") pod \"barbican6011-account-delete-8mtn8\" (UID: \"699b1f34-ccf2-4905-886d-10afe56a758f\") " pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.971026 4810 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.971076 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.97106151 +0000 UTC m=+8158.398312245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-scripts" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.971399 4810 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Oct 03 09:12:03 crc kubenswrapper[4810]: E1003 09:12:03.971432 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.9714234 +0000 UTC m=+8158.398674135 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.005965 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-7474k"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.041503 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-7474k"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.067819 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e5f34fe0-9224-4e20-ba36-8047c9fe43b0/ovsdbserver-nb/0.log" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.068088 4810 generic.go:334] "Generic (PLEG): container finished" podID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerID="7bb028b7a5bb063820d9ba2accea2471b634b11658dc72ef39f28643120fa6d7" exitCode=2 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.068107 4810 generic.go:334] "Generic (PLEG): container finished" podID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerID="6c91709fb1869d8c7dcdbb6f88a006eabccffee810f357690eff8446064838c1" exitCode=143 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.068171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerDied","Data":"7bb028b7a5bb063820d9ba2accea2471b634b11658dc72ef39f28643120fa6d7"} Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.068196 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerDied","Data":"6c91709fb1869d8c7dcdbb6f88a006eabccffee810f357690eff8446064838c1"} Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.072439 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-294xk\" (UniqueName: \"kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk\") pod \"placementbfbb-account-delete-nxskb\" (UID: \"e600d090-3ebc-4552-a5d6-0c3de54c5de3\") " pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.072696 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhrg\" (UniqueName: \"kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg\") pod \"barbican6011-account-delete-8mtn8\" (UID: \"699b1f34-ccf2-4905-886d-10afe56a758f\") " pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.074299 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.074348 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data podName:e9436321-8243-4a4a-b07a-b14668a07f1f nodeName:}" failed. No retries permitted until 2025-10-03 09:12:04.574333868 +0000 UTC m=+8158.001584603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data") pod "rabbitmq-cell1-server-0" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f") : configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.077502 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.080418 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.085240 4810 generic.go:334] "Generic (PLEG): container finished" podID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerID="9e8625267188d3164afef69fbc3a4e9502bd91a19d31086839f399a5a43e1daa" exitCode=2 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.085323 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerDied","Data":"9e8625267188d3164afef69fbc3a4e9502bd91a19d31086839f399a5a43e1daa"} Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.095714 4810 generic.go:334] "Generic (PLEG): container finished" podID="38687559-9344-403e-9762-2a737c197c66" containerID="0e74de867618f048890730ae1b8ae54e811efdf031ec6aee30450263081817ad" exitCode=2 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.095776 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerDied","Data":"0e74de867618f048890730ae1b8ae54e811efdf031ec6aee30450263081817ad"} Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.099224 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerID="a3d3c9e0db1422a18299e83c0a317086eace0fbe45d5f64b129666d18947412f" exitCode=2 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.101370 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerDied","Data":"a3d3c9e0db1422a18299e83c0a317086eace0fbe45d5f64b129666d18947412f"} Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.168617 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhrg\" (UniqueName: \"kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg\") pod \"barbican6011-account-delete-8mtn8\" (UID: \"699b1f34-ccf2-4905-886d-10afe56a758f\") " pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.169430 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="ovsdbserver-sb" containerID="cri-o://ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" gracePeriod=300 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.175591 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf77\" (UniqueName: \"kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77\") pod \"neutronf221-account-delete-pkq85\" (UID: \"b0e6d09e-815b-4c16-ae3e-79550f72b352\") " pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.175746 4810 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-294xk\" (UniqueName: \"kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk\") pod \"placementbfbb-account-delete-nxskb\" (UID: \"e600d090-3ebc-4552-a5d6-0c3de54c5de3\") " pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.183266 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.205282 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-1" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="ovsdbserver-sb" containerID="cri-o://c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" gracePeriod=300 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.209637 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-294xk\" (UniqueName: \"kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk\") pod \"placementbfbb-account-delete-nxskb\" (UID: \"e600d090-3ebc-4552-a5d6-0c3de54c5de3\") " pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.213620 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.216427 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.234202 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.277981 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.279398 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.279688 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf77\" (UniqueName: \"kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77\") pod \"neutronf221-account-delete-pkq85\" (UID: \"b0e6d09e-815b-4c16-ae3e-79550f72b352\") " pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.298064 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3 is running failed: container process not found" containerID="c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.298837 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3 is running failed: container process not found" containerID="c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.298854 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf77\" (UniqueName: \"kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77\") pod \"neutronf221-account-delete-pkq85\" (UID: \"b0e6d09e-815b-4c16-ae3e-79550f72b352\") " pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.305026 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.305564 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3 is running failed: container process not found" containerID="c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.305616 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-1" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="ovsdbserver-sb" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.318192 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.319277 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1 is running failed: container process not found" 
containerID="ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.322105 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.322540 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" containerID="cri-o://68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.322929 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="openstack-network-exporter" containerID="cri-o://1de9cdf0c83a3fe94c13ec3cd20f602220b80934411f5a0d8ab59ac3fce98075" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.323131 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1 is running failed: container process not found" containerID="ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.323160 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="ovsdbserver-sb" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.402003 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvtgt\" (UniqueName: \"kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt\") pod \"novacell0d0e8-account-delete-2dtvp\" (UID: \"19f98119-f52a-4cea-a154-7380157f3720\") " pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.402385 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq4th\" (UniqueName: \"kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th\") pod \"novacell13fd2-account-delete-ggrsp\" (UID: \"03d4344e-bfcb-42d1-8bcb-6924d24ae77b\") " pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.509730 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq4th\" (UniqueName: \"kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th\") pod \"novacell13fd2-account-delete-ggrsp\" (UID: \"03d4344e-bfcb-42d1-8bcb-6924d24ae77b\") " pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.509818 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvtgt\" (UniqueName: \"kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt\") pod \"novacell0d0e8-account-delete-2dtvp\" (UID: \"19f98119-f52a-4cea-a154-7380157f3720\") " pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 
03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.536827 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvtgt\" (UniqueName: \"kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt\") pod \"novacell0d0e8-account-delete-2dtvp\" (UID: \"19f98119-f52a-4cea-a154-7380157f3720\") " pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.547026 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq4th\" (UniqueName: \"kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th\") pod \"novacell13fd2-account-delete-ggrsp\" (UID: \"03d4344e-bfcb-42d1-8bcb-6924d24ae77b\") " pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.612524 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.613011 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data podName:e9436321-8243-4a4a-b07a-b14668a07f1f nodeName:}" failed. No retries permitted until 2025-10-03 09:12:05.612989409 +0000 UTC m=+8159.040240144 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data") pod "rabbitmq-cell1-server-0" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f") : configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.642877 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.643187 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon-log" containerID="cri-o://14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.643808 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" containerID="cri-o://a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.742232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.742505 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-log" containerID="cri-o://e5de03c1850b861978cbefa9ed2154f1aa86748c6ffcdac9186479f812fe22bd" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.743023 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-httpd" containerID="cri-o://67b54925633fb7897de4f9db6a5d84178f3f191ee9b36f058f8d0c7e2ef38c22" gracePeriod=30 Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.762262 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.818617 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.819026 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.819079 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data podName:849f27b3-b111-4722-9e85-528f2fbed78d nodeName:}" failed. No retries permitted until 2025-10-03 09:12:06.819058881 +0000 UTC m=+8160.246309616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data") pod "rabbitmq-server-0" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d") : configmap "rabbitmq-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.819579 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.819623 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:06.819614266 +0000 UTC m=+8160.246865001 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.922962 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.928905 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:04 crc kubenswrapper[4810]: E1003 09:12:04.928974 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:06.928953805 +0000 UTC m=+8160.356204540 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.936911 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-k6kp8"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.946327 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-k6kp8"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.955228 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pgz6g"] Oct 03 09:12:04 crc kubenswrapper[4810]: I1003 09:12:04.961374 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pgz6g"] Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.028523 4810 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.028602 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:07.028581755 +0000 UTC m=+8160.455832490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-scripts" not found Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.028859 4810 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.028931 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:07.028913104 +0000 UTC m=+8160.456163839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-config-data" not found Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.043997 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-w96tq"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.068449 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.069026 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="cinder-scheduler" containerID="cri-o://592213b9cf15535204f9f07e1afe93eab58cb7c7e9c4720ea154fc4abf864d38" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.070271 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="probe" containerID="cri-o://60d21af689f98576a4bb81b1221ab2ccefd04ca65804bc0605a1a62f203b010e" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.097618 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-w96tq"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.140289 4810 generic.go:334] "Generic (PLEG): container finished" podID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerID="e5de03c1850b861978cbefa9ed2154f1aa86748c6ffcdac9186479f812fe22bd" exitCode=143 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.140370 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerDied","Data":"e5de03c1850b861978cbefa9ed2154f1aa86748c6ffcdac9186479f812fe22bd"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.158222 4810 generic.go:334] "Generic (PLEG): container finished" podID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerID="1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897" exitCode=2 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.158345 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerDied","Data":"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.176450 4810 generic.go:334] "Generic (PLEG): container finished" podID="db2735e6-11ae-45cc-94fe-9628468a810a" containerID="1de9cdf0c83a3fe94c13ec3cd20f602220b80934411f5a0d8ab59ac3fce98075" exitCode=2 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.176522 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerDied","Data":"1de9cdf0c83a3fe94c13ec3cd20f602220b80934411f5a0d8ab59ac3fce98075"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.190569 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance35f0-account-delete-h8t2h" event={"ID":"a7e14d67-c732-445d-b7c7-ffb607ff1d3a","Type":"ContainerStarted","Data":"59f6d6e2b21e24b80a2054579b3e0b660ce498892c06041ea8112fc3afa8cd91"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.195845 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_38687559-9344-403e-9762-2a737c197c66/ovsdbserver-sb/0.log" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.195931 4810 generic.go:334] "Generic (PLEG): container finished" podID="38687559-9344-403e-9762-2a737c197c66" containerID="ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" exitCode=143 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.196083 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerDied","Data":"ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.199233 4810 generic.go:334] "Generic (PLEG): container finished" podID="c783295d-2aa9-46e1-baad-eef94c60dc6f" containerID="38f4857d992be3b22e9531eec0f7a8496e516e8d7855074bc7163e25e6d53fa8" exitCode=137 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208054 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208410 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api-log" containerID="cri-o://efbbea635fd7c2512387b085e21c476168a7ee9b5fafe68ca05dc78b78d49a5c" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208659 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_ffece18b-f8fb-403a-8500-e00e53d9856e/ovsdbserver-sb/0.log" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208704 4810 generic.go:334] "Generic (PLEG): container finished" podID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerID="9f40d85da6ecfa7849d414ba4e3f550a824b8bba334f771ee2af8a049f0905e5" exitCode=2 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208720 4810 generic.go:334] "Generic (PLEG): container finished" podID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerID="c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" exitCode=143 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208742 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerDied","Data":"9f40d85da6ecfa7849d414ba4e3f550a824b8bba334f771ee2af8a049f0905e5"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208767 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerDied","Data":"c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3"} Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.208866 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api" containerID="cri-o://be1d419bdcaddfa86e01ac0347a626d35490d6b41d2b7976a512edfe5809af3c" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.234333 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.234694 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="dnsmasq-dns" 
containerID="cri-o://b15c68365da2964e4b50c0011f035005f70603978d31e1076761b396d98183c1" gracePeriod=10 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.335838 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9" path="/var/lib/kubelet/pods/1b2d6940-5268-4cf8-b1d4-f5a9cae7e1f9/volumes" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.336733 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.355075 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39603956-d605-4c0c-a342-a18f22e572bd" path="/var/lib/kubelet/pods/39603956-d605-4c0c-a342-a18f22e572bd/volumes" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.361619 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b28102f-d4e0-4348-8ab4-98be581908e2" path="/var/lib/kubelet/pods/9b28102f-d4e0-4348-8ab4-98be581908e2/volumes" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.367629 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d780816a-47b4-4278-a386-d7172d7aa0a0" path="/var/lib/kubelet/pods/d780816a-47b4-4278-a386-d7172d7aa0a0/volumes" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373329 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373383 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373396 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-mh5pr"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373468 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-mh5pr"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373631 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-api" containerID="cri-o://08cd416b8eeef505b3e0b38ba8da9d068bedaeea3f4d831442ca091390459861" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373771 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-listener" containerID="cri-o://2ba2dbc8167fc1bd6f1b2a1f51b4d2aeb50ad0c06902101cb2b2d5cf0dcefff2" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.373984 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-notifier" containerID="cri-o://c978bb239f34435050aeea1f40571c0760c669b74d2a1948cabd533429556d2a" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.374049 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-evaluator" containerID="cri-o://5440941186e8bdddd5f88722a1e8ed75a86f824038cb407b01408da32a0da878" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.374186 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-c96f8d7b-nv762" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-log" 
containerID="cri-o://0438a7801eb086bb2f2988fbf6a2faaadb40a8b29e9157eea6ecfa708ea3864e" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.374278 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-c96f8d7b-nv762" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-api" containerID="cri-o://aafd95e2c2156c5d62c9e52e62267e00c8cc9c2c5da26c2f67f4a5b4754d0e28" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.413000 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.446561 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.446784 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b9499669-jd95s" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-api" containerID="cri-o://b96ac21f6b885aebfbc17e90eb2244817f36b2064b8dcd70ce7c7d24cb87aa0f" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.447201 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b9499669-jd95s" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-httpd" containerID="cri-o://7f5bf483256e9e40c501144201c3dc96463a540abcc9d8379933be6e0f86f926" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.468109 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-d134-account-create-xcc2k"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.485057 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-ssk6q"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.502496 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-ssk6q"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.505204 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-d134-account-create-xcc2k"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.519011 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.527862 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.569974 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.629041 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.648872 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.656572 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:05 crc kubenswrapper[4810]: E1003 09:12:05.656625 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data podName:e9436321-8243-4a4a-b07a-b14668a07f1f nodeName:}" failed. No retries permitted until 2025-10-03 09:12:07.656612753 +0000 UTC m=+8161.083863488 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data") pod "rabbitmq-cell1-server-0" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f") : configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.661391 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.676971 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.708401 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.811082 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.811712 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-577984bbd4-45rx5" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" containerID="cri-o://f0b610453374d95fa8d8b8835a740b4722435108e83542f0e16bb20b2adca98f" gracePeriod=60 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.851856 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.868187 4810 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell13fd2-account-delete-ggrsp" secret="" err="secret \"galera-openstack-cell1-dockercfg-8628r\" not found" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.868237 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.889822 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.891834 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" containerID="cri-o://180bad1c50d897c2a6c43df0a1afbfa7d6c5a60011b7aa487c86d10db9b1980d" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.890942 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" containerID="cri-o://c0451198ffb49cf3efcc829fbf4f4b7d23054fd0f095c80c41379ab2c12fc9df" gracePeriod=30 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.905288 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="rabbitmq" containerID="cri-o://5b4764a5166f67c1b430dfc61b76795284f3ddc58b5b1b1ce0e10d2c554b70f0" gracePeriod=604800 Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.986142 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e5f34fe0-9224-4e20-ba36-8047c9fe43b0/ovsdbserver-nb/0.log" Oct 03 09:12:05 crc kubenswrapper[4810]: I1003 09:12:05.986257 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.001871 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_38687559-9344-403e-9762-2a737c197c66/ovsdbserver-sb/0.log" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.001994 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.066241 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.094292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5ld\" (UniqueName: \"kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.095676 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.095767 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.095865 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.095917 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6qnz\" (UniqueName: \"kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.095980 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096021 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096061 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096104 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096156 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096183 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096215 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096740 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") pod \"38687559-9344-403e-9762-2a737c197c66\" (UID: \"38687559-9344-403e-9762-2a737c197c66\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096776 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.096811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs\") pod \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\" (UID: \"e5f34fe0-9224-4e20-ba36-8047c9fe43b0\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.113320 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config" (OuterVolumeSpecName: "config") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.117453 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts" (OuterVolumeSpecName: "scripts") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.118100 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config" (OuterVolumeSpecName: "config") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.125869 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts" (OuterVolumeSpecName: "scripts") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.126307 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld" (OuterVolumeSpecName: "kube-api-access-pk5ld") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "kube-api-access-pk5ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.136165 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz" (OuterVolumeSpecName: "kube-api-access-p6qnz") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "kube-api-access-p6qnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.137455 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.137372 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.154097 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.156319 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.174100 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.197623 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" containerID="cri-o://a51f1a244e80ed64834a6d007023e154a4199448c67d8f2e29312bbb25ff2f8c" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.198107 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" containerID="cri-o://4458f484a54244dae918accd79a7327d261f442db2d6d5731024ddb8dc509061" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.204885 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle\") pod \"c783295d-2aa9-46e1-baad-eef94c60dc6f\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.205034 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szxgg\" (UniqueName: \"kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg\") pod \"c783295d-2aa9-46e1-baad-eef94c60dc6f\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.205345 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config\") pod \"c783295d-2aa9-46e1-baad-eef94c60dc6f\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.205434 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret\") pod \"c783295d-2aa9-46e1-baad-eef94c60dc6f\" (UID: \"c783295d-2aa9-46e1-baad-eef94c60dc6f\") " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.207883 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5ld\" (UniqueName: \"kubernetes.io/projected/38687559-9344-403e-9762-2a737c197c66-kube-api-access-pk5ld\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.207972 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6qnz\" (UniqueName: \"kubernetes.io/projected/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-kube-api-access-p6qnz\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.207982 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.208014 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-scripts\") on node \"crc\" 
DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.208023 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.208032 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.208040 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/38687559-9344-403e-9762-2a737c197c66-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.208049 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38687559-9344-403e-9762-2a737c197c66-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.212908 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.249977 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.250390 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" containerID="cri-o://f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" gracePeriod=60 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.252353 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg" (OuterVolumeSpecName: "kube-api-access-szxgg") pod "c783295d-2aa9-46e1-baad-eef94c60dc6f" (UID: "c783295d-2aa9-46e1-baad-eef94c60dc6f"). InnerVolumeSpecName "kube-api-access-szxgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.267119 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76bc9bd849-gxqg6"] Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.270779 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config-data kube-api-access-fscfn], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/heat-engine-76bc9bd849-gxqg6" podUID="95eba70b-a04a-4ca8-8e43-0ee212328321" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.279427 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.296278 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.296506 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-httpd" containerID="cri-o://ccaddef8fdc23da0618108ad3920e7dc36166714d51af18712d008fbb1ee8cf6" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.296915 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-server" containerID="cri-o://276347e95ea2f8ef1722075a72ac28686ed4623fefd6f74a753d70385f27ad2e" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.308539 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_38687559-9344-403e-9762-2a737c197c66/ovsdbserver-sb/0.log" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.308637 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"38687559-9344-403e-9762-2a737c197c66","Type":"ContainerDied","Data":"3339f686fca5904af2854a31580acf6e1f8d8467c59651449bb5f64aed5dc2ea"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.308673 4810 scope.go:117] "RemoveContainer" containerID="0e74de867618f048890730ae1b8ae54e811efdf031ec6aee30450263081817ad" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.308807 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.311102 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szxgg\" (UniqueName: \"kubernetes.io/projected/c783295d-2aa9-46e1-baad-eef94c60dc6f-kube-api-access-szxgg\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.315344 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.315748 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-d6cf467cf-6jvjp" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerName="heat-api" containerID="cri-o://a2c2611ee5c4b7a3eb513f4e9f6d3e9dfa093b9ff621e7dcd667a7d9a57b1f3e" gracePeriod=60 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.319799 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.381463 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.381729 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7688d87799-z7jvj" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker-log" containerID="cri-o://a9eb5c4e834ec18df22b9dc93a6f7d4c539136ebfda5ef3d4283bbbf3ada1234" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.381931 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7688d87799-z7jvj" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker" containerID="cri-o://cdda3f94e72b2e0fc1363ad8dec85327b12a98239d858659e6c717c1fc77cf5f" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.390993 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "pvc-4c855458-0021-4bc5-8f03-96336c93c964". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.410167 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.411384 4810 generic.go:334] "Generic (PLEG): container finished" podID="463a9578-5093-4812-a27c-c4d849a5ec67" containerID="c0451198ffb49cf3efcc829fbf4f4b7d23054fd0f095c80c41379ab2c12fc9df" exitCode=143 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.411453 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerDied","Data":"c0451198ffb49cf3efcc829fbf4f4b7d23054fd0f095c80c41379ab2c12fc9df"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.417294 4810 generic.go:334] "Generic (PLEG): container finished" podID="990ef3ca-87e1-4b2b-8714-86263523425b" containerID="efbbea635fd7c2512387b085e21c476168a7ee9b5fafe68ca05dc78b78d49a5c" exitCode=143 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.417413 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerDied","Data":"efbbea635fd7c2512387b085e21c476168a7ee9b5fafe68ca05dc78b78d49a5c"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.439254 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-4kpwj"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.444549 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") on node \"crc\" " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.444616 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") on node \"crc\" " Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.467546 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-4kpwj"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.473615 4810 generic.go:334] "Generic (PLEG): container finished" podID="14265138-5fbe-4187-928c-fbe84d832080" containerID="0438a7801eb086bb2f2988fbf6a2faaadb40a8b29e9157eea6ecfa708ea3864e" exitCode=143 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.473717 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerDied","Data":"0438a7801eb086bb2f2988fbf6a2faaadb40a8b29e9157eea6ecfa708ea3864e"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.475879 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.476115 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6866556666-25fj4" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api-log" containerID="cri-o://68ad4f5b62f1bf62badf506a3cc48bf9e7ddd204e527495d5adf65f34b954e77" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.476538 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6866556666-25fj4" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api" containerID="cri-o://bef0d553e5791a6b7a41534f43da71d6aedce548e058b8e26b6a986f52a3616f" 
gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.499440 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance35f0-account-delete-h8t2h" event={"ID":"a7e14d67-c732-445d-b7c7-ffb607ff1d3a","Type":"ContainerStarted","Data":"ea6defe8bf672751992101e754e07e04f6da09fc040a2ef4b978763f323a1cfa"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.526856 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e5f34fe0-9224-4e20-ba36-8047c9fe43b0/ovsdbserver-nb/0.log" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.527012 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e5f34fe0-9224-4e20-ba36-8047c9fe43b0","Type":"ContainerDied","Data":"b77877208c57c7ee2fd86d1c19759438a454c9c5f8343d78023e9fd3b557aac8"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.527094 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.539592 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.539870 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener-log" containerID="cri-o://ba8f23b00e574ed38b94b464a2550358de6f47b13a732b76199ab43ac8a36415" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.539995 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener" containerID="cri-o://e38e84bc6b6ee906a778b23d1848296550b447437e04a7e712e4ebf93b978f4a" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.567668 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-a3b8-account-create-qm2vt"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.577120 4810 generic.go:334] "Generic (PLEG): container finished" podID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerID="b15c68365da2964e4b50c0011f035005f70603978d31e1076761b396d98183c1" exitCode=0 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.577171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" event={"ID":"2246ca28-9b3f-40d7-8d8c-4feb750d5d41","Type":"ContainerDied","Data":"b15c68365da2964e4b50c0011f035005f70603978d31e1076761b396d98183c1"} Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.612406 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-a3b8-account-create-qm2vt"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.652948 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.675829 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.678680 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.686761 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.686796 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.710914 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c783295d-2aa9-46e1-baad-eef94c60dc6f" (UID: "c783295d-2aa9-46e1-baad-eef94c60dc6f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.716757 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.717027 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ba6cb84bebecb7683aa43366962d518dd88cdaed82bb6fafec8f5209c8020422" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.727912 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.728302 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="alertmanager" containerID="cri-o://66f1ba2e75970a1aba16f1afb2a73d7784ddebbe5e4717cbfe405fc0b9fc68dd" gracePeriod=120 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.728557 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/alertmanager-metric-storage-0" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="config-reloader" containerID="cri-o://43e7dc7eb5e00cbb5009d4a66ffb6aa6e9fffc55143b955e5c66ac08125d7298" gracePeriod=120 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.730467 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.105:6080/vnc_lite.html\": read tcp 10.217.0.2:36280->10.217.1.105:6080: read: connection reset by peer" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.737993 
4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.738392 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="prometheus" containerID="cri-o://695d874952bb7697f4f41bab1649fbdd13c3b210395aab381ef29bcdad672912" gracePeriod=600 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.738569 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="thanos-sidecar" containerID="cri-o://a140b7e92a1d9149e2d71e898a7a3c0826dc5bdb44363234e8c1a19baa36023f" gracePeriod=600 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.738627 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="config-reloader" containerID="cri-o://bfb8c46462d42f7cabbdb9785c5516092ad0178d1e18d3fc0e4dc555e3aec222" gracePeriod=600 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.765232 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xht9"] Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.770229 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.770450 4810 scope.go:117] "RemoveContainer" containerID="ed27a985c6f1c9f475ac64c592ca18fda4edc8f534d84d9c77a899f4bbcfbba1" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.770859 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.771395 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xht9"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.781444 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4c855458-0021-4bc5-8f03-96336c93c964" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964") on node "crc" Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.785919 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.789558 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.789600 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-4c855458-0021-4bc5-8f03-96336c93c964\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4c855458-0021-4bc5-8f03-96336c93c964\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.806098 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="rabbitmq" containerID="cri-o://3e4b15b2571b4b9e95c498a522370b026aa306f3f2b72b93a885f270f401f309" gracePeriod=604800 Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.816271 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.816322 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.816636 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_ffece18b-f8fb-403a-8500-e00e53d9856e/ovsdbserver-sb/0.log" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.816711 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.851398 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.877332 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.877618 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" gracePeriod=30 Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.893093 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.893303 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.893301 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.893353 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data podName:849f27b3-b111-4722-9e85-528f2fbed78d nodeName:}" failed. No retries permitted until 2025-10-03 09:12:10.893339153 +0000 UTC m=+8164.320589888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data") pod "rabbitmq-server-0" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d") : configmap "rabbitmq-config-data" not found Oct 03 09:12:06 crc kubenswrapper[4810]: E1003 09:12:06.893390 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:10.893366004 +0000 UTC m=+8164.320616809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.918118 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c783295d-2aa9-46e1-baad-eef94c60dc6f" (UID: "c783295d-2aa9-46e1-baad-eef94c60dc6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.921039 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.941371 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.941506 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d") on node "crc" Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.945040 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:12:06 crc kubenswrapper[4810]: I1003 09:12:06.945258 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" gracePeriod=30 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.014431 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.014874 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.014936 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.014973 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.014980 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.014995 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="ovsdbserver-nb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015001 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="ovsdbserver-nb" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.015013 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="dnsmasq-dns" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015018 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="dnsmasq-dns" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.015027 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015032 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.015038 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015044 4810 
state_mem.go:107] "Deleted CPUSet assignment" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.015054 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015060 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.015070 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="init" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015075 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="init" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015255 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015270 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015284 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015300 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="38687559-9344-403e-9762-2a737c197c66" containerName="ovsdbserver-sb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015311 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" containerName="openstack-network-exporter" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015321 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="dnsmasq-dns" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.015331 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" containerName="ovsdbserver-nb" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.055111 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067255 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067311 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067358 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067444 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc\") pod \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067470 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067499 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config\") pod \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.067542 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq7bk\" (UniqueName: \"kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068281 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068476 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8knc\" (UniqueName: \"kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc\") pod \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle\") pod 
\"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068646 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb\") pod \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068842 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb\") pod \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\" (UID: \"2246ca28-9b3f-40d7-8d8c-4feb750d5d41\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.068918 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs\") pod \"ffece18b-f8fb-403a-8500-e00e53d9856e\" (UID: \"ffece18b-f8fb-403a-8500-e00e53d9856e\") " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.069375 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.069866 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.069882 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.069913 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f43d8157-c567-49db-a884-d09fd7cdeb0d\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.070013 4810 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.070075 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:11.070052701 +0000 UTC m=+8164.497303446 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-scripts" not found Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.072288 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts" (OuterVolumeSpecName: "scripts") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.073962 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.077609 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk" (OuterVolumeSpecName: "kube-api-access-wq7bk") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "kube-api-access-wq7bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.080453 4810 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.080543 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:11.080521711 +0000 UTC m=+8164.507772446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-config-data" not found Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.082210 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.082262 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:11.082245847 +0000 UTC m=+8164.509496582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.084591 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config" (OuterVolumeSpecName: "config") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.109224 4810 scope.go:117] "RemoveContainer" containerID="38f4857d992be3b22e9531eec0f7a8496e516e8d7855074bc7163e25e6d53fa8" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.124588 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.106866 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.142613 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc" (OuterVolumeSpecName: "kube-api-access-w8knc") pod "2246ca28-9b3f-40d7-8d8c-4feb750d5d41" (UID: "2246ca28-9b3f-40d7-8d8c-4feb750d5d41"). InnerVolumeSpecName "kube-api-access-w8knc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.156866 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c783295d-2aa9-46e1-baad-eef94c60dc6f" (UID: "c783295d-2aa9-46e1-baad-eef94c60dc6f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177148 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177192 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177267 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177474 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq7bk\" (UniqueName: \"kubernetes.io/projected/ffece18b-f8fb-403a-8500-e00e53d9856e-kube-api-access-wq7bk\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177487 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8knc\" (UniqueName: \"kubernetes.io/projected/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-kube-api-access-w8knc\") on node \"crc\" DevicePath 
\"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177498 4810 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c783295d-2aa9-46e1-baad-eef94c60dc6f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177507 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177516 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffece18b-f8fb-403a-8500-e00e53d9856e-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.177524 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.187061 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.246956 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="galera" containerID="cri-o://cbb2088da894702fc6a881a5dab4338b4a03c2b8469ca40cdbf5ae5f3ce22b40" gracePeriod=29 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.282604 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.282655 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.282793 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.283740 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.284686 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 
09:12:07.302146 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "pvc-84b38357-1dc1-44e4-8f01-39348df287e6". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.379312 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138a7b81-cd2a-4106-8743-b1520c597d0b" path="/var/lib/kubelet/pods/138a7b81-cd2a-4106-8743-b1520c597d0b/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.380768 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8c1d63-ceef-405e-8e0c-ac022d824c44" path="/var/lib/kubelet/pods/2a8c1d63-ceef-405e-8e0c-ac022d824c44/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.381456 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ba9c65-630f-4a79-9233-ad8cecbcbf40" path="/var/lib/kubelet/pods/41ba9c65-630f-4a79-9233-ad8cecbcbf40/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.382238 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4843e422-4c1a-41ed-9626-2593e9f064fd" path="/var/lib/kubelet/pods/4843e422-4c1a-41ed-9626-2593e9f064fd/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.383634 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e300415-0c77-4c7d-a35b-5bc2b5e6a01d" path="/var/lib/kubelet/pods/5e300415-0c77-4c7d-a35b-5bc2b5e6a01d/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.384239 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65d99cf4-6580-44c8-b7c5-807251deaf31" path="/var/lib/kubelet/pods/65d99cf4-6580-44c8-b7c5-807251deaf31/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.384781 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" path="/var/lib/kubelet/pods/c783295d-2aa9-46e1-baad-eef94c60dc6f/volumes" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.385985 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") on node \"crc\" " Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.433470 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj\") pod \"redhat-operators-8p5tt\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.473148 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.485503 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.506785 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.506866 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.525481 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.604876 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.646468 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2246ca28-9b3f-40d7-8d8c-4feb750d5d41" (UID: "2246ca28-9b3f-40d7-8d8c-4feb750d5d41"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.692293 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.692478 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-84b38357-1dc1-44e4-8f01-39348df287e6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6") on node "crc" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.707937 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.708542 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_ffece18b-f8fb-403a-8500-e00e53d9856e/ovsdbserver-sb/0.log" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.709152 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.711377 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.711435 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.711449 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-84b38357-1dc1-44e4-8f01-39348df287e6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84b38357-1dc1-44e4-8f01-39348df287e6\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.711545 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:07 crc kubenswrapper[4810]: E1003 09:12:07.711623 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data podName:e9436321-8243-4a4a-b07a-b14668a07f1f nodeName:}" failed. No retries permitted until 2025-10-03 09:12:11.711593679 +0000 UTC m=+8165.138844414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data") pod "rabbitmq-cell1-server-0" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f") : configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.729427 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2246ca28-9b3f-40d7-8d8c-4feb750d5d41" (UID: "2246ca28-9b3f-40d7-8d8c-4feb750d5d41"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.741809 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "e5f34fe0-9224-4e20-ba36-8047c9fe43b0" (UID: "e5f34fe0-9224-4e20-ba36-8047c9fe43b0"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.747785 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerID="5440941186e8bdddd5f88722a1e8ed75a86f824038cb407b01408da32a0da878" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.747818 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerID="08cd416b8eeef505b3e0b38ba8da9d068bedaeea3f4d831442ca091390459861" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.765670 4810 generic.go:334] "Generic (PLEG): container finished" podID="3d48e153-4470-4082-8f04-319557247b18" containerID="a51f1a244e80ed64834a6d007023e154a4199448c67d8f2e29312bbb25ff2f8c" exitCode=143 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.793396 4810 generic.go:334] "Generic (PLEG): container finished" podID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerID="43e7dc7eb5e00cbb5009d4a66ffb6aa6e9fffc55143b955e5c66ac08125d7298" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.793431 4810 generic.go:334] "Generic (PLEG): container finished" podID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerID="66f1ba2e75970a1aba16f1afb2a73d7784ddebbe5e4717cbfe405fc0b9fc68dd" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.795206 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "38687559-9344-403e-9762-2a737c197c66" (UID: "38687559-9344-403e-9762-2a737c197c66"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.800151 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config" (OuterVolumeSpecName: "config") pod "2246ca28-9b3f-40d7-8d8c-4feb750d5d41" (UID: "2246ca28-9b3f-40d7-8d8c-4feb750d5d41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.821354 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2246ca28-9b3f-40d7-8d8c-4feb750d5d41" (UID: "2246ca28-9b3f-40d7-8d8c-4feb750d5d41"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.822160 4810 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.822195 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.822207 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/38687559-9344-403e-9762-2a737c197c66-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.822219 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5f34fe0-9224-4e20-ba36-8047c9fe43b0-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.822290 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2246ca28-9b3f-40d7-8d8c-4feb750d5d41-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.848151 4810 generic.go:334] "Generic (PLEG): container finished" podID="6545022e-2717-4dda-801e-d88c8b037558" containerID="60d21af689f98576a4bb81b1221ab2ccefd04ca65804bc0605a1a62f203b010e" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.852425 4810 generic.go:334] "Generic (PLEG): container finished" podID="99cbdb61-8043-480e-836c-876ea826c5c3" containerID="a140b7e92a1d9149e2d71e898a7a3c0826dc5bdb44363234e8c1a19baa36023f" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.852442 4810 generic.go:334] "Generic (PLEG): container finished" podID="99cbdb61-8043-480e-836c-876ea826c5c3" containerID="bfb8c46462d42f7cabbdb9785c5516092ad0178d1e18d3fc0e4dc555e3aec222" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.852449 4810 generic.go:334] "Generic (PLEG): container finished" podID="99cbdb61-8043-480e-836c-876ea826c5c3" containerID="695d874952bb7697f4f41bab1649fbdd13c3b210395aab381ef29bcdad672912" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.856464 4810 generic.go:334] "Generic (PLEG): container finished" podID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerID="a9eb5c4e834ec18df22b9dc93a6f7d4c539136ebfda5ef3d4283bbbf3ada1234" exitCode=143 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.863534 4810 generic.go:334] "Generic (PLEG): container finished" podID="a7e14d67-c732-445d-b7c7-ffb607ff1d3a" containerID="ea6defe8bf672751992101e754e07e04f6da09fc040a2ef4b978763f323a1cfa" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.869255 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.872143 4810 generic.go:334] "Generic (PLEG): container finished" podID="48f62813-7b25-45b6-9942-a5e54b40a235" containerID="7f5bf483256e9e40c501144201c3dc96463a540abcc9d8379933be6e0f86f926" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882216 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"ffece18b-f8fb-403a-8500-e00e53d9856e","Type":"ContainerDied","Data":"430dfc21522fc77148d10bdee3fb431fac9e74408515e2ce3900f2942cfa23d4"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882271 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerDied","Data":"5440941186e8bdddd5f88722a1e8ed75a86f824038cb407b01408da32a0da878"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882287 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerDied","Data":"08cd416b8eeef505b3e0b38ba8da9d068bedaeea3f4d831442ca091390459861"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882299 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerDied","Data":"a51f1a244e80ed64834a6d007023e154a4199448c67d8f2e29312bbb25ff2f8c"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882310 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerDied","Data":"43e7dc7eb5e00cbb5009d4a66ffb6aa6e9fffc55143b955e5c66ac08125d7298"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882326 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerDied","Data":"66f1ba2e75970a1aba16f1afb2a73d7784ddebbe5e4717cbfe405fc0b9fc68dd"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerDied","Data":"60d21af689f98576a4bb81b1221ab2ccefd04ca65804bc0605a1a62f203b010e"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882348 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerDied","Data":"a140b7e92a1d9149e2d71e898a7a3c0826dc5bdb44363234e8c1a19baa36023f"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerDied","Data":"bfb8c46462d42f7cabbdb9785c5516092ad0178d1e18d3fc0e4dc555e3aec222"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882367 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerDied","Data":"695d874952bb7697f4f41bab1649fbdd13c3b210395aab381ef29bcdad672912"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882377 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:07 crc kubenswrapper[4810]: 
I1003 09:12:07.882390 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerDied","Data":"a9eb5c4e834ec18df22b9dc93a6f7d4c539136ebfda5ef3d4283bbbf3ada1234"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882401 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance35f0-account-delete-h8t2h" event={"ID":"a7e14d67-c732-445d-b7c7-ffb607ff1d3a","Type":"ContainerDied","Data":"ea6defe8bf672751992101e754e07e04f6da09fc040a2ef4b978763f323a1cfa"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882414 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance35f0-account-delete-h8t2h" event={"ID":"a7e14d67-c732-445d-b7c7-ffb607ff1d3a","Type":"ContainerDied","Data":"59f6d6e2b21e24b80a2054579b3e0b660ce498892c06041ea8112fc3afa8cd91"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882424 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f6d6e2b21e24b80a2054579b3e0b660ce498892c06041ea8112fc3afa8cd91" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.882434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerDied","Data":"7f5bf483256e9e40c501144201c3dc96463a540abcc9d8379933be6e0f86f926"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.920267 4810 generic.go:334] "Generic (PLEG): container finished" podID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerID="276347e95ea2f8ef1722075a72ac28686ed4623fefd6f74a753d70385f27ad2e" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.920310 4810 generic.go:334] "Generic (PLEG): container finished" podID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerID="ccaddef8fdc23da0618108ad3920e7dc36166714d51af18712d008fbb1ee8cf6" exitCode=0 Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.920400 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerDied","Data":"276347e95ea2f8ef1722075a72ac28686ed4623fefd6f74a753d70385f27ad2e"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.920431 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerDied","Data":"ccaddef8fdc23da0618108ad3920e7dc36166714d51af18712d008fbb1ee8cf6"} Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.922531 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.923911 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:07 crc kubenswrapper[4810]: I1003 09:12:07.981304 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ffece18b-f8fb-403a-8500-e00e53d9856e" (UID: "ffece18b-f8fb-403a-8500-e00e53d9856e"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.026320 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffece18b-f8fb-403a-8500-e00e53d9856e-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.026622 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" event={"ID":"2246ca28-9b3f-40d7-8d8c-4feb750d5d41","Type":"ContainerDied","Data":"a2ed6591046b98e0fc64eb8d188ab68ab0e1381294b139a1117c13cdb1624c76"} Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.026738 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.096915 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4b2d-account-delete-5zfwv" event={"ID":"a0ff0594-a3c0-4939-8113-d4611d0a53ce","Type":"ContainerStarted","Data":"094d459475b946fcaebb762c881b67487d2e509ed5632e02021af2452b9ed1ca"} Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.107133 4810 generic.go:334] "Generic (PLEG): container finished" podID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerID="787a3821a90f8b366d38035e7db82d3692844c42b7d2bf85811ba83389e1173a" exitCode=0 Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.107285 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerDied","Data":"787a3821a90f8b366d38035e7db82d3692844c42b7d2bf85811ba83389e1173a"} Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.134245 4810 generic.go:334] "Generic (PLEG): container finished" podID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerID="68ad4f5b62f1bf62badf506a3cc48bf9e7ddd204e527495d5adf65f34b954e77" exitCode=143 Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.134336 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerDied","Data":"68ad4f5b62f1bf62badf506a3cc48bf9e7ddd204e527495d5adf65f34b954e77"} Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.155706 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.173930 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.174126 4810 scope.go:117] "RemoveContainer" containerID="7bb028b7a5bb063820d9ba2accea2471b634b11658dc72ef39f28643120fa6d7" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.177169 4810 generic.go:334] "Generic (PLEG): container finished" podID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerID="ba8f23b00e574ed38b94b464a2550358de6f47b13a732b76199ab43ac8a36415" exitCode=143 Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.177282 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerDied","Data":"ba8f23b00e574ed38b94b464a2550358de6f47b13a732b76199ab43ac8a36415"} Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.185585 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.186332 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heatd134-account-delete-5svfk" event={"ID":"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9","Type":"ContainerStarted","Data":"637325ce8356d26b4d2e07c5a427657664c3c75647073562d9639076406c8517"} Oct 03 09:12:08 crc kubenswrapper[4810]: W1003 09:12:08.216654 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0e6d09e_815b_4c16_ae3e_79550f72b352.slice/crio-8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387 WatchSource:0}: Error finding container 8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387: Status 404 returned error can't find the container with id 8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387 Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.330464 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.351970 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.358271 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.386808 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.411480 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.446399 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6dcd\" (UniqueName: \"kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd\") pod \"a7e14d67-c732-445d-b7c7-ffb607ff1d3a\" (UID: \"a7e14d67-c732-445d-b7c7-ffb607ff1d3a\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.483656 4810 scope.go:117] "RemoveContainer" containerID="6c91709fb1869d8c7dcdbb6f88a006eabccffee810f357690eff8446064838c1" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.509610 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd" (OuterVolumeSpecName: "kube-api-access-q6dcd") pod "a7e14d67-c732-445d-b7c7-ffb607ff1d3a" (UID: "a7e14d67-c732-445d-b7c7-ffb607ff1d3a"). InnerVolumeSpecName "kube-api-access-q6dcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.512094 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.532076 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551248 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551541 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551564 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551610 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551694 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551729 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x2rh\" (UniqueName: \"kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.551755 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle\") pod \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\" (UID: \"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf\") " Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.553197 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6dcd\" (UniqueName: \"kubernetes.io/projected/a7e14d67-c732-445d-b7c7-ffb607ff1d3a-kube-api-access-q6dcd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.555294 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.555808 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs" (OuterVolumeSpecName: "logs") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.560383 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.582762 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-844fb94945-h8dp6"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.595225 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.69:8776/healthcheck\": read tcp 10.217.0.2:55122->10.217.1.69:8776: read: connection reset by peer" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.601621 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.610684 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.618739 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts" (OuterVolumeSpecName: "scripts") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.619541 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh" (OuterVolumeSpecName: "kube-api-access-7x2rh") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "kube-api-access-7x2rh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.655336 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.655389 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x2rh\" (UniqueName: \"kubernetes.io/projected/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-kube-api-access-7x2rh\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.655402 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.655415 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.760634 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.822187 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data" (OuterVolumeSpecName: "config-data") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.858835 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" (UID: "ad2fe8b2-c0a4-4839-bd1c-18e665a881cf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.868938 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.868972 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:08 crc kubenswrapper[4810]: I1003 09:12:08.868984 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.083870 4810 scope.go:117] "RemoveContainer" containerID="9f40d85da6ecfa7849d414ba4e3f550a824b8bba334f771ee2af8a049f0905e5" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.110748 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.227171 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" event={"ID":"64afa73e-dc4b-4a80-9597-0c552e43f979","Type":"ContainerDied","Data":"fdad967f08f16f22a27f70ae2b7d9f46686c4a201b26d0d6859a9f12cbb4fe92"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.227536 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdad967f08f16f22a27f70ae2b7d9f46686c4a201b26d0d6859a9f12cbb4fe92" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.251531 4810 generic.go:334] "Generic (PLEG): container finished" podID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerID="e38e84bc6b6ee906a778b23d1848296550b447437e04a7e712e4ebf93b978f4a" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.251636 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerDied","Data":"e38e84bc6b6ee906a778b23d1848296550b447437e04a7e712e4ebf93b978f4a"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.251668 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" event={"ID":"7d4234c2-ea29-4c53-aea2-05cbaee65464","Type":"ContainerDied","Data":"65d9fd647735f56cf5b3bb1035d79865533360e329c4b709f523d19429d03972"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.251681 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65d9fd647735f56cf5b3bb1035d79865533360e329c4b709f523d19429d03972" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.260443 4810 generic.go:334] "Generic (PLEG): container finished" podID="0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" containerID="91ed729b353c6b324966ca1e7a28e60053f9e4bae84eac39f377089ebb2b27eb" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.260505 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heatd134-account-delete-5svfk" event={"ID":"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9","Type":"ContainerDied","Data":"91ed729b353c6b324966ca1e7a28e60053f9e4bae84eac39f377089ebb2b27eb"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.282156 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbfbb-account-delete-nxskb" event={"ID":"e600d090-3ebc-4552-a5d6-0c3de54c5de3","Type":"ContainerStarted","Data":"a1ad0f30975672fd73c21e5b452e80a59bdd01a8e822d5e67b94668447466864"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.283430 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.283519 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.283560 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.283866 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284099 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284132 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mphm\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284197 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284292 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284342 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284374 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.284433 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config\") pod \"99cbdb61-8043-480e-836c-876ea826c5c3\" (UID: \"99cbdb61-8043-480e-836c-876ea826c5c3\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.294330 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod 
"99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.301863 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.306090 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.306099 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.307544 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm" (OuterVolumeSpecName: "kube-api-access-7mphm") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "kube-api-access-7mphm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.310972 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.311203 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config" (OuterVolumeSpecName: "config") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.314253 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). 
InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.314292 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out" (OuterVolumeSpecName: "config-out") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.318074 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.346222 4810 scope.go:117] "RemoveContainer" containerID="c7d745f287846f068383eb0e24d4d5fcd3953008bfd3c8d8c56c312dfc2dfdb3" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.380525 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "pvc-69452521-9c79-47c6-8ea5-998f140a0d2a". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.382869 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" path="/var/lib/kubelet/pods/2246ca28-9b3f-40d7-8d8c-4feb750d5d41/volumes" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.389031 4810 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.389093 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") on node \"crc\" " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.389105 4810 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/99cbdb61-8043-480e-836c-876ea826c5c3-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.389115 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mphm\" (UniqueName: \"kubernetes.io/projected/99cbdb61-8043-480e-836c-876ea826c5c3-kube-api-access-7mphm\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.389124 4810 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.392203 4810 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.392256 4810 reconciler_common.go:293] "Volume 
detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.392274 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.392288 4810 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.392304 4810 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/99cbdb61-8043-480e-836c-876ea826c5c3-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.394503 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38687559-9344-403e-9762-2a737c197c66" path="/var/lib/kubelet/pods/38687559-9344-403e-9762-2a737c197c66/volumes" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.395689 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f34fe0-9224-4e20-ba36-8047c9fe43b0" path="/var/lib/kubelet/pods/e5f34fe0-9224-4e20-ba36-8047c9fe43b0/volumes" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.397190 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffece18b-f8fb-403a-8500-e00e53d9856e" path="/var/lib/kubelet/pods/ffece18b-f8fb-403a-8500-e00e53d9856e/volumes" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.455764 4810 generic.go:334] "Generic (PLEG): container finished" podID="a0ff0594-a3c0-4939-8113-d4611d0a53ce" containerID="ce0bf123079fad6602177f98443c87294f51896340fb885890994936a2b8dca1" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.462576 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.462736 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-69452521-9c79-47c6-8ea5-998f140a0d2a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a") on node "crc" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.485676 4810 generic.go:334] "Generic (PLEG): container finished" podID="51261961-7cea-4ee8-8009-d0669f796caa" containerID="cbb2088da894702fc6a881a5dab4338b4a03c2b8469ca40cdbf5ae5f3ce22b40" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.491528 4810 generic.go:334] "Generic (PLEG): container finished" podID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerID="cdda3f94e72b2e0fc1363ad8dec85327b12a98239d858659e6c717c1fc77cf5f" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.493648 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-69452521-9c79-47c6-8ea5-998f140a0d2a\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.496464 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.516862 4810 generic.go:334] "Generic (PLEG): container finished" podID="990ef3ca-87e1-4b2b-8714-86263523425b" containerID="be1d419bdcaddfa86e01ac0347a626d35490d6b41d2b7976a512edfe5809af3c" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.523207 4810 generic.go:334] "Generic (PLEG): container finished" podID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerID="ba6cb84bebecb7683aa43366962d518dd88cdaed82bb6fafec8f5209c8020422" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.531648 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": read tcp 10.217.0.2:59448->10.217.1.110:8775: read: connection reset by peer" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.532057 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": read tcp 10.217.0.2:59438->10.217.1.110:8775: read: connection reset by peer" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.573979 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config" (OuterVolumeSpecName: "web-config") pod "99cbdb61-8043-480e-836c-876ea826c5c3" (UID: "99cbdb61-8043-480e-836c-876ea826c5c3"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575001 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6011-account-delete-8mtn8" event={"ID":"699b1f34-ccf2-4905-886d-10afe56a758f","Type":"ContainerStarted","Data":"20a298b5beb5f87533c8b58a64b8c2100e2c6894a3174fbf1cbee1085ac7cc3a"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575065 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575082 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"99cbdb61-8043-480e-836c-876ea826c5c3","Type":"ContainerDied","Data":"d0093dd8aa49c6d6090c46f25a1f776c62911fbfaafc33c8a5a613f60e613848"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575100 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronf221-account-delete-pkq85" event={"ID":"b0e6d09e-815b-4c16-ae3e-79550f72b352","Type":"ContainerStarted","Data":"8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575135 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4b2d-account-delete-5zfwv" event={"ID":"a0ff0594-a3c0-4939-8113-d4611d0a53ce","Type":"ContainerDied","Data":"ce0bf123079fad6602177f98443c87294f51896340fb885890994936a2b8dca1"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575169 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerDied","Data":"cbb2088da894702fc6a881a5dab4338b4a03c2b8469ca40cdbf5ae5f3ce22b40"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575182 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerDied","Data":"cdda3f94e72b2e0fc1363ad8dec85327b12a98239d858659e6c717c1fc77cf5f"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575214 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7688d87799-z7jvj" event={"ID":"29e116f1-db5c-41a3-9a6e-03c09d7f8103","Type":"ContainerDied","Data":"70243bb2239e34b21a066d21957c415544fa02aff12a63e6940a9612cae5d09d"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575225 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70243bb2239e34b21a066d21957c415544fa02aff12a63e6940a9612cae5d09d" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575235 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ad2fe8b2-c0a4-4839-bd1c-18e665a881cf","Type":"ContainerDied","Data":"0f714d9ba4d18c319d6c7a89a36f7f6a124b1a686c26f0111a661be7af6a8c27"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.575250 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerDied","Data":"be1d419bdcaddfa86e01ac0347a626d35490d6b41d2b7976a512edfe5809af3c"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.577854 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0d0e8-account-delete-2dtvp" 
event={"ID":"19f98119-f52a-4cea-a154-7380157f3720","Type":"ContainerStarted","Data":"a23f84b8b8509a498fc3f706b0240cb8324e756a058be04f252c84f5cbc59b6e"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.579660 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe","Type":"ContainerDied","Data":"ba6cb84bebecb7683aa43366962d518dd88cdaed82bb6fafec8f5209c8020422"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.579695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe","Type":"ContainerDied","Data":"81434baf854e7a7e4e5ba310cc83eec8a78c224a00665c0f25b5b5ac10eee5b7"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.580942 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81434baf854e7a7e4e5ba310cc83eec8a78c224a00665c0f25b5b5ac10eee5b7" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.581303 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d","Type":"ContainerDied","Data":"203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.581318 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203bd90b8419d3a4101f8874f9b47c22321daf0d039b9473b1e88e319b9a9011" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.594659 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.598886 4810 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/99cbdb61-8043-480e-836c-876ea826c5c3-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.609249 4810 scope.go:117] "RemoveContainer" containerID="b15c68365da2964e4b50c0011f035005f70603978d31e1076761b396d98183c1" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.614146 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.618696 4810 generic.go:334] "Generic (PLEG): container finished" podID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerID="a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.618770 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerDied","Data":"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.641652 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerID="c978bb239f34435050aeea1f40571c0760c669b74d2a1948cabd533429556d2a" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.641707 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerDied","Data":"c978bb239f34435050aeea1f40571c0760c669b74d2a1948cabd533429556d2a"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.659291 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerDied","Data":"aafd95e2c2156c5d62c9e52e62267e00c8cc9c2c5da26c2f67f4a5b4754d0e28"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.659582 4810 generic.go:334] "Generic (PLEG): container finished" podID="14265138-5fbe-4187-928c-fbe84d832080" containerID="aafd95e2c2156c5d62c9e52e62267e00c8cc9c2c5da26c2f67f4a5b4754d0e28" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.670085 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.113:8774/\": read tcp 10.217.0.2:33162->10.217.1.113:8774: read: connection reset by peer" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.678084 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.113:8774/\": read tcp 10.217.0.2:33158->10.217.1.113:8774: read: connection reset by peer" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.692546 4810 generic.go:334] "Generic (PLEG): container finished" podID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerID="67b54925633fb7897de4f9db6a5d84178f3f191ee9b36f058f8d0c7e2ef38c22" exitCode=0 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.692660 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance35f0-account-delete-h8t2h" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.692947 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerDied","Data":"67b54925633fb7897de4f9db6a5d84178f3f191ee9b36f058f8d0c7e2ef38c22"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.692992 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b","Type":"ContainerDied","Data":"65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3"} Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.693007 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.704331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom\") pod \"95eba70b-a04a-4ca8-8e43-0ee212328321\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.711658 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle\") pod \"95eba70b-a04a-4ca8-8e43-0ee212328321\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.720128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95eba70b-a04a-4ca8-8e43-0ee212328321" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.726659 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "95eba70b-a04a-4ca8-8e43-0ee212328321" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.756607 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.756832 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerName="nova-scheduler-scheduler" containerID="cri-o://6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" gracePeriod=30 Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.776471 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.799792 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.816317 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.816353 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.854789 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.871675 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.915611 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917659 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917742 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8scx6\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917822 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917848 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917873 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917954 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.917993 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.918029 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs\") pod \"64afa73e-dc4b-4a80-9597-0c552e43f979\" (UID: \"64afa73e-dc4b-4a80-9597-0c552e43f979\") " Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.924379 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.924575 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.943084 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.954697 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.955107 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6" (OuterVolumeSpecName: "kube-api-access-8scx6") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "kube-api-access-8scx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.967122 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.973166 4810 scope.go:117] "RemoveContainer" containerID="82c2b517a15956667e2ff7d96e00a8894f23fd7255a9bfd77aa7585c60994331" Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.973635 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance35f0-account-delete-h8t2h"] Oct 03 09:12:09 crc kubenswrapper[4810]: I1003 09:12:09.989412 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025102 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2w2w\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs\") pod \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025624 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025673 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025708 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025734 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs\") pod \"7d4234c2-ea29-4c53-aea2-05cbaee65464\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025765 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out\") pod \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\" (UID: \"a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025844 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr7pb\" (UniqueName: \"kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb\") pod \"7d4234c2-ea29-4c53-aea2-05cbaee65464\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.025973 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom\") pod \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026012 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcf2d\" (UniqueName: \"kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d\") pod \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026623 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle\") pod \"7d4234c2-ea29-4c53-aea2-05cbaee65464\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026671 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data\") pod \"7d4234c2-ea29-4c53-aea2-05cbaee65464\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026727 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom\") pod \"7d4234c2-ea29-4c53-aea2-05cbaee65464\" (UID: \"7d4234c2-ea29-4c53-aea2-05cbaee65464\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026771 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data\") pod \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.026811 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle\") pod \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\" (UID: \"29e116f1-db5c-41a3-9a6e-03c09d7f8103\") " Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.027657 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8scx6\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-kube-api-access-8scx6\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.027683 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.027696 4810 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/64afa73e-dc4b-4a80-9597-0c552e43f979-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.027709 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/64afa73e-dc4b-4a80-9597-0c552e43f979-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.028770 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.032728 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs" (OuterVolumeSpecName: "logs") pod "29e116f1-db5c-41a3-9a6e-03c09d7f8103" (UID: "29e116f1-db5c-41a3-9a6e-03c09d7f8103"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.035330 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db" (OuterVolumeSpecName: "alertmanager-metric-storage-db") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "alertmanager-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.044971 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs" (OuterVolumeSpecName: "logs") pod "7d4234c2-ea29-4c53-aea2-05cbaee65464" (UID: "7d4234c2-ea29-4c53-aea2-05cbaee65464"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.049854 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out" (OuterVolumeSpecName: "config-out") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.058175 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.059841 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w" (OuterVolumeSpecName: "kube-api-access-d2w2w") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "kube-api-access-d2w2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.073203 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb" (OuterVolumeSpecName: "kube-api-access-hr7pb") pod "7d4234c2-ea29-4c53-aea2-05cbaee65464" (UID: "7d4234c2-ea29-4c53-aea2-05cbaee65464"). InnerVolumeSpecName "kube-api-access-hr7pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.074681 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7d4234c2-ea29-4c53-aea2-05cbaee65464" (UID: "7d4234c2-ea29-4c53-aea2-05cbaee65464"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.075159 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d" (OuterVolumeSpecName: "kube-api-access-rcf2d") pod "29e116f1-db5c-41a3-9a6e-03c09d7f8103" (UID: "29e116f1-db5c-41a3-9a6e-03c09d7f8103"). InnerVolumeSpecName "kube-api-access-rcf2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.080108 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.083376 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29e116f1-db5c-41a3-9a6e-03c09d7f8103" (UID: "29e116f1-db5c-41a3-9a6e-03c09d7f8103"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129515 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129560 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcf2d\" (UniqueName: \"kubernetes.io/projected/29e116f1-db5c-41a3-9a6e-03c09d7f8103-kube-api-access-rcf2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129574 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129586 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2w2w\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-kube-api-access-d2w2w\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129600 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29e116f1-db5c-41a3-9a6e-03c09d7f8103-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129613 4810 reconciler_common.go:293] "Volume detached for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-alertmanager-metric-storage-db\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129625 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129639 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d4234c2-ea29-4c53-aea2-05cbaee65464-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129651 4810 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129663 4810 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.129674 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr7pb\" (UniqueName: \"kubernetes.io/projected/7d4234c2-ea29-4c53-aea2-05cbaee65464-kube-api-access-hr7pb\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.138657 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.138923 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-central-agent" containerID="cri-o://2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8" 
gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.139268 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-notification-agent" containerID="cri-o://dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.139318 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="sg-core" containerID="cri-o://8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.139361 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="proxy-httpd" containerID="cri-o://dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.196482 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.196749 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" containerName="kube-state-metrics" containerID="cri-o://dbab8a7e512236638d4064d4a8006a3f143bbc0cfea49417d843d0140eb8cafb" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.346682 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-577984bbd4-45rx5" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.133:8000/healthcheck\": read tcp 10.217.0.2:35498->10.217.1.133:8000: read: connection reset by peer" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.370417 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29e116f1-db5c-41a3-9a6e-03c09d7f8103" (UID: "29e116f1-db5c-41a3-9a6e-03c09d7f8103"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.401494 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d4234c2-ea29-4c53-aea2-05cbaee65464" (UID: "7d4234c2-ea29-4c53-aea2-05cbaee65464"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.425678 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.437542 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6866556666-25fj4" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.1.48:9311/healthcheck\": dial tcp 10.217.1.48:9311: connect: connection refused" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.438088 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6866556666-25fj4" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.1.48:9311/healthcheck\": dial tcp 10.217.1.48:9311: connect: connection refused" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.446143 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.446173 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.446182 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.449335 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.485924 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data" (OuterVolumeSpecName: "config-data") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.550420 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.550451 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.551686 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64afa73e-dc4b-4a80-9597-0c552e43f979" (UID: "64afa73e-dc4b-4a80-9597-0c552e43f979"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.618319 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.618523 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="c56de889-4796-4d3c-87b0-faff35387c26" containerName="memcached" containerID="cri-o://211daa9ed516000f1909977a5746e579501a880585f3b68881c446794b0cd9ed" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.643790 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data" (OuterVolumeSpecName: "config-data") pod "7d4234c2-ea29-4c53-aea2-05cbaee65464" (UID: "7d4234c2-ea29-4c53-aea2-05cbaee65464"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.664482 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config" (OuterVolumeSpecName: "web-config") pod "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" (UID: "a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.682707 4810 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.682739 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64afa73e-dc4b-4a80-9597-0c552e43f979-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.682754 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d4234c2-ea29-4c53-aea2-05cbaee65464-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.722563 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.734440 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c96f8d7b-nv762" event={"ID":"14265138-5fbe-4187-928c-fbe84d832080","Type":"ContainerDied","Data":"dfad215d84e80e7be57e8b4eb52259b776b7a757a4e050faaf8e666cb4529c89"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.734480 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfad215d84e80e7be57e8b4eb52259b776b7a757a4e050faaf8e666cb4529c89" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.737438 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data" (OuterVolumeSpecName: "config-data") pod "29e116f1-db5c-41a3-9a6e-03c09d7f8103" (UID: "29e116f1-db5c-41a3-9a6e-03c09d7f8103"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.774574 4810 generic.go:334] "Generic (PLEG): container finished" podID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerID="a2c2611ee5c4b7a3eb513f4e9f6d3e9dfa093b9ff621e7dcd667a7d9a57b1f3e" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.774725 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d6cf467cf-6jvjp" event={"ID":"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293","Type":"ContainerDied","Data":"a2c2611ee5c4b7a3eb513f4e9f6d3e9dfa093b9ff621e7dcd667a7d9a57b1f3e"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.787143 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29e116f1-db5c-41a3-9a6e-03c09d7f8103-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.807536 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystoneb067-account-delete-b8gsz"] Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808046 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808062 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808080 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-server" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808087 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-server" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808103 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808111 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808124 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808133 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808148 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808156 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808180 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="init-config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808188 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="init-config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808224 4810 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="alertmanager" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808232 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="alertmanager" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808245 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808252 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808267 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7e14d67-c732-445d-b7c7-ffb607ff1d3a" containerName="mariadb-account-delete" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808274 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7e14d67-c732-445d-b7c7-ffb607ff1d3a" containerName="mariadb-account-delete" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808284 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="prometheus" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808290 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="prometheus" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808301 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808306 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808317 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="init-config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808333 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="init-config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808342 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808350 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-log" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808363 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808369 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener-log" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.808380 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808386 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker-log" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 
09:12:10.808402 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="thanos-sidecar" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808408 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="thanos-sidecar" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808603 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808615 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-server" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808631 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7e14d67-c732-445d-b7c7-ffb607ff1d3a" containerName="mariadb-account-delete" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808642 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808656 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" containerName="barbican-keystone-listener-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808666 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808677 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" containerName="alertmanager" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808691 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-httpd" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808700 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="prometheus" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808709 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808714 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="thanos-sidecar" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808724 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" containerName="barbican-worker" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808732 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="config-reloader" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.808741 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" containerName="glance-log" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.809454 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.814093 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell13fd2-account-delete-ggrsp" event={"ID":"03d4344e-bfcb-42d1-8bcb-6924d24ae77b","Type":"ContainerStarted","Data":"f5e391710f681b3c8dc1e1f31390ba4212d1e600d3a0cb00683e292ee4bff9d0"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.817140 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.817563 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-b9848c8cf-kb9qj" podUID="e4be45a7-4886-405d-b10b-0519258715f6" containerName="keystone-api" containerID="cri-o://3db6c0f0fe1196ece0d1fd23d0a431452da6842720d8625cef9b225452286fdb" gracePeriod=30 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.829067 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystoneb067-account-delete-b8gsz"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.843828 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-cron-29324701-wvvbm"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.844739 4810 generic.go:334] "Generic (PLEG): container finished" podID="e600d090-3ebc-4552-a5d6-0c3de54c5de3" containerID="6c7a7c61b7e6749bd7310b1f5a4503ed0964eabf688b3f9038e92660a8c8e783" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.844807 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbfbb-account-delete-nxskb" event={"ID":"e600d090-3ebc-4552-a5d6-0c3de54c5de3","Type":"ContainerDied","Data":"6c7a7c61b7e6749bd7310b1f5a4503ed0964eabf688b3f9038e92660a8c8e783"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.849163 4810 generic.go:334] "Generic (PLEG): container finished" podID="463a9578-5093-4812-a27c-c4d849a5ec67" containerID="180bad1c50d897c2a6c43df0a1afbfa7d6c5a60011b7aa487c86d10db9b1980d" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.849208 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerDied","Data":"180bad1c50d897c2a6c43df0a1afbfa7d6c5a60011b7aa487c86d10db9b1980d"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.861034 4810 generic.go:334] "Generic (PLEG): container finished" podID="19f98119-f52a-4cea-a154-7380157f3720" containerID="45a1515c2acae09dfee0ed6214a4b847e7576925c5609e90af82c1960e172c1a" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.861145 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0d0e8-account-delete-2dtvp" event={"ID":"19f98119-f52a-4cea-a154-7380157f3720","Type":"ContainerDied","Data":"45a1515c2acae09dfee0ed6214a4b847e7576925c5609e90af82c1960e172c1a"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.870996 4810 generic.go:334] "Generic (PLEG): container finished" podID="6545022e-2717-4dda-801e-d88c8b037558" containerID="592213b9cf15535204f9f07e1afe93eab58cb7c7e9c4720ea154fc4abf864d38" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.871059 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerDied","Data":"592213b9cf15535204f9f07e1afe93eab58cb7c7e9c4720ea154fc4abf864d38"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 
09:12:10.873053 4810 generic.go:334] "Generic (PLEG): container finished" podID="b0e6d09e-815b-4c16-ae3e-79550f72b352" containerID="4a43af4fc1e60e505d5beb2a5ed97dbb636ccb1deeb1559ec83a92891659f7d8" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.873326 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronf221-account-delete-pkq85" event={"ID":"b0e6d09e-815b-4c16-ae3e-79550f72b352","Type":"ContainerDied","Data":"4a43af4fc1e60e505d5beb2a5ed97dbb636ccb1deeb1559ec83a92891659f7d8"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.892500 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") pod \"keystoneb067-account-delete-b8gsz\" (UID: \"406a32fa-cd77-42ae-9239-80be47525eba\") " pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.893793 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-cron-29324701-wvvbm"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.899042 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.905249 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystoneb067-account-delete-b8gsz"] Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.918463 4810 generic.go:334] "Generic (PLEG): container finished" podID="64def3d7-6342-432a-a0c2-b562b7514bca" containerID="dbab8a7e512236638d4064d4a8006a3f143bbc0cfea49417d843d0140eb8cafb" exitCode=2 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.918560 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64def3d7-6342-432a-a0c2-b562b7514bca","Type":"ContainerDied","Data":"dbab8a7e512236638d4064d4a8006a3f143bbc0cfea49417d843d0140eb8cafb"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.961273 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"51261961-7cea-4ee8-8009-d0669f796caa","Type":"ContainerDied","Data":"9770605882dda004ff7db2eefb6ee2e01136630ad05f1e87740139db27d2c8df"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.961546 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9770605882dda004ff7db2eefb6ee2e01136630ad05f1e87740139db27d2c8df" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.970236 4810 generic.go:334] "Generic (PLEG): container finished" podID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerID="2ba2dbc8167fc1bd6f1b2a1f51b4d2aeb50ad0c06902101cb2b2d5cf0dcefff2" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.970335 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerDied","Data":"2ba2dbc8167fc1bd6f1b2a1f51b4d2aeb50ad0c06902101cb2b2d5cf0dcefff2"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.972977 4810 generic.go:334] "Generic (PLEG): container finished" podID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerID="f0b610453374d95fa8d8b8835a740b4722435108e83542f0e16bb20b2adca98f" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.973249 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-577984bbd4-45rx5" 
event={"ID":"d03f8837-7028-4fa8-b8e2-9e12a35037c5","Type":"ContainerDied","Data":"f0b610453374d95fa8d8b8835a740b4722435108e83542f0e16bb20b2adca98f"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.982104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heatd134-account-delete-5svfk" event={"ID":"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9","Type":"ContainerDied","Data":"637325ce8356d26b4d2e07c5a427657664c3c75647073562d9639076406c8517"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.982141 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637325ce8356d26b4d2e07c5a427657664c3c75647073562d9639076406c8517" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.988038 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder4b2d-account-delete-5zfwv" event={"ID":"a0ff0594-a3c0-4939-8113-d4611d0a53ce","Type":"ContainerDied","Data":"094d459475b946fcaebb762c881b67487d2e509ed5632e02021af2452b9ed1ca"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.988086 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="094d459475b946fcaebb762c881b67487d2e509ed5632e02021af2452b9ed1ca" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.992743 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerID="dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.992770 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerID="8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318" exitCode=2 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.992811 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerDied","Data":"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.992834 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerDied","Data":"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318"} Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.993947 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.994042 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") pod \"keystoneb067-account-delete-b8gsz\" (UID: \"406a32fa-cd77-42ae-9239-80be47525eba\") " pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.994121 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.994191 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data podName:849f27b3-b111-4722-9e85-528f2fbed78d nodeName:}" 
failed. No retries permitted until 2025-10-03 09:12:18.994173762 +0000 UTC m=+8172.421424497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data") pod "rabbitmq-server-0" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d") : configmap "rabbitmq-config-data" not found Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.994310 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:10 crc kubenswrapper[4810]: E1003 09:12:10.994428 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:18.994410368 +0000 UTC m=+8172.421661103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.996636 4810 generic.go:334] "Generic (PLEG): container finished" podID="3d48e153-4470-4082-8f04-319557247b18" containerID="4458f484a54244dae918accd79a7327d261f442db2d6d5731024ddb8dc509061" exitCode=0 Oct 03 09:12:10 crc kubenswrapper[4810]: I1003 09:12:10.996693 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerDied","Data":"4458f484a54244dae918accd79a7327d261f442db2d6d5731024ddb8dc509061"} Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.001488 4810 projected.go:194] Error preparing data for projected volume kube-api-access-56h88 for pod openstack/keystoneb067-account-delete-b8gsz: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.001560 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88 podName:406a32fa-cd77-42ae-9239-80be47525eba nodeName:}" failed. No retries permitted until 2025-10-03 09:12:11.501541249 +0000 UTC m=+8164.928791984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-56h88" (UniqueName: "kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88") pod "keystoneb067-account-delete-b8gsz" (UID: "406a32fa-cd77-42ae-9239-80be47525eba") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.009780 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"990ef3ca-87e1-4b2b-8714-86263523425b","Type":"ContainerDied","Data":"6d50228b624939992d60712869eaa0248b1a78dbd9158d13f41f238c53adc4c4"} Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.009823 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d50228b624939992d60712869eaa0248b1a78dbd9158d13f41f238c53adc4c4" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.019226 4810 generic.go:334] "Generic (PLEG): container finished" podID="699b1f34-ccf2-4905-886d-10afe56a758f" containerID="e40f124aa7b826bdb9776fd908df831935f9c765b9d0d3fec510488278301c94" exitCode=0 Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.019354 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6011-account-delete-8mtn8" event={"ID":"699b1f34-ccf2-4905-886d-10afe56a758f","Type":"ContainerDied","Data":"e40f124aa7b826bdb9776fd908df831935f9c765b9d0d3fec510488278301c94"} Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.020922 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4 is running failed: container process not found" containerID="3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.021486 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4 is running failed: container process not found" containerID="3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.022086 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4 is running failed: container process not found" containerID="3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.022121 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerName="nova-cell0-conductor-conductor" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.022849 4810 generic.go:334] "Generic (PLEG): container finished" podID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerID="3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" exitCode=0 Oct 03 09:12:11 crc kubenswrapper[4810]: 
I1003 09:12:11.022928 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72f95e65-4a0d-4f33-b977-abca4f20aeef","Type":"ContainerDied","Data":"3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4"} Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.024913 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerStarted","Data":"17621b6427bae8344b55e256af9f0a6d165ee530a41dd7483c1ce325ca42d88d"} Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.030629 4810 generic.go:334] "Generic (PLEG): container finished" podID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerID="bef0d553e5791a6b7a41534f43da71d6aedce548e058b8e26b6a986f52a3616f" exitCode=0 Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.030904 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.040065 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerDied","Data":"bef0d553e5791a6b7a41534f43da71d6aedce548e058b8e26b6a986f52a3616f"} Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.040184 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7688d87799-z7jvj" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.040565 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.045098 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.046352 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.096587 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.097078 4810 secret.go:188] Couldn't get secret openstack/placement-config-data: secret "placement-config-data" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.097204 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:19.097147531 +0000 UTC m=+8172.524398266 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-config-data" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.098045 4810 secret.go:188] Couldn't get secret openstack/placement-scripts: secret "placement-scripts" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.098072 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts podName:14265138-5fbe-4187-928c-fbe84d832080 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:19.098064246 +0000 UTC m=+8172.525314981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts") pod "placement-c96f8d7b-nv762" (UID: "14265138-5fbe-4187-928c-fbe84d832080") : secret "placement-scripts" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.102393 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.102445 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:19.102435173 +0000 UTC m=+8172.529685908 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.145377 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="galera" containerID="cri-o://69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" gracePeriod=30 Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.352123 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="c56de889-4796-4d3c-87b0-faff35387c26" containerName="memcached" probeResult="failure" output="dial tcp 10.217.1.1:11211: connect: connection refused" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.380404 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fa4b6d-43d9-4b34-a264-80a71cfcdb50" path="/var/lib/kubelet/pods/84fa4b6d-43d9-4b34-a264-80a71cfcdb50/volumes" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.381401 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" path="/var/lib/kubelet/pods/99cbdb61-8043-480e-836c-876ea826c5c3/volumes" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.382326 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7e14d67-c732-445d-b7c7-ffb607ff1d3a" path="/var/lib/kubelet/pods/a7e14d67-c732-445d-b7c7-ffb607ff1d3a/volumes" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.383653 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ad2fe8b2-c0a4-4839-bd1c-18e665a881cf" path="/var/lib/kubelet/pods/ad2fe8b2-c0a4-4839-bd1c-18e665a881cf/volumes" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.399541 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407148 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-846jc\" (UniqueName: \"kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407235 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407384 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407433 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407487 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407741 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.407803 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs\") pod \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\" (UID: \"8d1d96d1-04f9-4dd6-aabc-5a55b291d47b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.410346 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs" (OuterVolumeSpecName: "logs") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.410642 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.419947 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc" (OuterVolumeSpecName: "kube-api-access-846jc") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "kube-api-access-846jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.423237 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts" (OuterVolumeSpecName: "scripts") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.430379 4810 scope.go:117] "RemoveContainer" containerID="a140b7e92a1d9149e2d71e898a7a3c0826dc5bdb44363234e8c1a19baa36023f" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.471439 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.513649 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") pod \"keystoneb067-account-delete-b8gsz\" (UID: \"406a32fa-cd77-42ae-9239-80be47525eba\") " pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.513981 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.513997 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.514009 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.514021 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-846jc\" (UniqueName: \"kubernetes.io/projected/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-kube-api-access-846jc\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.514032 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.522199 4810 projected.go:194] Error preparing data for projected volume kube-api-access-56h88 for pod openstack/keystoneb067-account-delete-b8gsz: failed to fetch token: serviceaccounts "galera-openstack" not 
found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.522279 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88 podName:406a32fa-cd77-42ae-9239-80be47525eba nodeName:}" failed. No retries permitted until 2025-10-03 09:12:12.522252772 +0000 UTC m=+8165.949503507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-56h88" (UniqueName: "kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88") pod "keystoneb067-account-delete-b8gsz" (UID: "406a32fa-cd77-42ae-9239-80be47525eba") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.525924 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-844fb94945-h8dp6" podUID="2246ca28-9b3f-40d7-8d8c-4feb750d5d41" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.112:5353: i/o timeout" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.585481 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data" (OuterVolumeSpecName: "config-data") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.597454 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" (UID: "8d1d96d1-04f9-4dd6-aabc-5a55b291d47b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.616645 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.616692 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.719505 4810 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.719588 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data podName:e9436321-8243-4a4a-b07a-b14668a07f1f nodeName:}" failed. No retries permitted until 2025-10-03 09:12:19.71956612 +0000 UTC m=+8173.146816925 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data") pod "rabbitmq-cell1-server-0" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f") : configmap "rabbitmq-cell1-config-data" not found Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.741329 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.754144 4810 scope.go:117] "RemoveContainer" containerID="bfb8c46462d42f7cabbdb9785c5516092ad0178d1e18d3fc0e4dc555e3aec222" Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.760101 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.766594 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.768173 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.770828 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.770887 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.777807 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.780634 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.792305 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.813413 4810 scope.go:117] "RemoveContainer" containerID="695d874952bb7697f4f41bab1649fbdd13c3b210395aab381ef29bcdad672912" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.821001 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7688d87799-z7jvj"] Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.825231 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.830912 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.834928 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.857371 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.858711 4810 scope.go:117] "RemoveContainer" containerID="f13a52947ceffedfa962780acfef9809f64ca3164e93bfaa9673bc9fcd4a3572" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.871866 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="99cbdb61-8043-480e-836c-876ea826c5c3" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.1.146:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.881949 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:12:11 crc kubenswrapper[4810]: E1003 09:12:11.907160 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-56h88], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystoneb067-account-delete-b8gsz" podUID="406a32fa-cd77-42ae-9239-80be47525eba" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.918102 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925242 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925305 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs\") pod \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925340 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925362 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925402 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925440 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle\") pod \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925466 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925577 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925643 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs\") pod \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925684 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925718 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925757 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925779 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data\") pod \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925803 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925844 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: 
I1003 09:12:11.925871 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925945 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925976 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcfvc\" (UniqueName: \"kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.925997 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.926021 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzrq\" (UniqueName: \"kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq\") pod \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\" (UID: \"abef8580-e6ef-4e8a-887d-bdf45e9a5bbe\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.926053 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnjx\" (UniqueName: \"kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.926076 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.927216 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs" (OuterVolumeSpecName: "logs") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.927266 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.928417 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.930988 4810 scope.go:117] "RemoveContainer" containerID="787a3821a90f8b366d38035e7db82d3692844c42b7d2bf85811ba83389e1173a" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.934427 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.936828 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.937858 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.941079 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.941209 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.949971 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.953035 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq" (OuterVolumeSpecName: "kube-api-access-gmzrq") pod "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" (UID: "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe"). InnerVolumeSpecName "kube-api-access-gmzrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.957143 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx" (OuterVolumeSpecName: "kube-api-access-nrnjx") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "kube-api-access-nrnjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.960041 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc" (OuterVolumeSpecName: "kube-api-access-bcfvc") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "kube-api-access-bcfvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.960313 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 09:12:11 crc kubenswrapper[4810]: I1003 09:12:11.961802 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets" (OuterVolumeSpecName: "secrets") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.001431 4810 scope.go:117] "RemoveContainer" containerID="8e41db6129ebb440e14b354aafb38602fc26eb206790f69566dcdbdb50d588a8" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.005732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts" (OuterVolumeSpecName: "scripts") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.024106 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" (UID: "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.027643 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9gp\" (UniqueName: \"kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028234 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028270 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028294 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs\") pod \"463a9578-5093-4812-a27c-c4d849a5ec67\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028314 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrsvz\" (UniqueName: \"kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz\") pod \"463a9578-5093-4812-a27c-c4d849a5ec67\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028337 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsjkr\" (UniqueName: \"kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr\") pod \"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9\" (UID: \"0ecc84ff-bf6e-49da-b3e7-24f595f52cd9\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028365 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028445 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028469 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028491 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts\") pod 
\"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.028567 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029311 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029376 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029397 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029424 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029445 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029464 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9ct\" (UniqueName: \"kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct\") pod \"a0ff0594-a3c0-4939-8113-d4611d0a53ce\" (UID: \"a0ff0594-a3c0-4939-8113-d4611d0a53ce\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029479 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.029502 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle\") pod 
\"463a9578-5093-4812-a27c-c4d849a5ec67\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.030309 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs\") pod \"3d48e153-4470-4082-8f04-319557247b18\" (UID: \"3d48e153-4470-4082-8f04-319557247b18\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.030569 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.030602 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data\") pod \"463a9578-5093-4812-a27c-c4d849a5ec67\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.030632 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psh77\" (UniqueName: \"kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.037135 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbqkf\" (UniqueName: \"kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf\") pod \"14265138-5fbe-4187-928c-fbe84d832080\" (UID: \"14265138-5fbe-4187-928c-fbe84d832080\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.037208 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs\") pod \"463a9578-5093-4812-a27c-c4d849a5ec67\" (UID: \"463a9578-5093-4812-a27c-c4d849a5ec67\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.037976 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcfvc\" (UniqueName: \"kubernetes.io/projected/990ef3ca-87e1-4b2b-8714-86263523425b-kube-api-access-bcfvc\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.037991 4810 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038002 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzrq\" (UniqueName: \"kubernetes.io/projected/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-kube-api-access-gmzrq\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038011 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnjx\" (UniqueName: \"kubernetes.io/projected/51261961-7cea-4ee8-8009-d0669f796caa-kube-api-access-nrnjx\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038020 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/990ef3ca-87e1-4b2b-8714-86263523425b-logs\") on node \"crc\" DevicePath 
\"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038027 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038035 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038044 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038054 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038062 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/990ef3ca-87e1-4b2b-8714-86263523425b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038071 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038082 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51261961-7cea-4ee8-8009-d0669f796caa-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038090 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/51261961-7cea-4ee8-8009-d0669f796caa-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.038853 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs" (OuterVolumeSpecName: "logs") pod "463a9578-5093-4812-a27c-c4d849a5ec67" (UID: "463a9578-5093-4812-a27c-c4d849a5ec67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.053633 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs" (OuterVolumeSpecName: "logs") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.054731 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs" (OuterVolumeSpecName: "logs") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.090601 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbfbb-account-delete-nxskb" event={"ID":"e600d090-3ebc-4552-a5d6-0c3de54c5de3","Type":"ContainerDied","Data":"a1ad0f30975672fd73c21e5b452e80a59bdd01a8e822d5e67b94668447466864"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.090643 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1ad0f30975672fd73c21e5b452e80a59bdd01a8e822d5e67b94668447466864" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.093780 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts" (OuterVolumeSpecName: "scripts") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.094078 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"463a9578-5093-4812-a27c-c4d849a5ec67","Type":"ContainerDied","Data":"4214e7f40b81374bd21871ca80061b3198f6b89f231c5f4fdfcc4d7db8cfc465"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.094112 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.096441 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"64def3d7-6342-432a-a0c2-b562b7514bca","Type":"ContainerDied","Data":"04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.096481 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f0e802330f0812c592325e4b1b86aba633e825fd8083df9f1d21f12ca5a530" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.099402 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf" (OuterVolumeSpecName: "kube-api-access-nbqkf") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "kube-api-access-nbqkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.100145 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77" (OuterVolumeSpecName: "kube-api-access-psh77") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "kube-api-access-psh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.100804 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0d0e8-account-delete-2dtvp" event={"ID":"19f98119-f52a-4cea-a154-7380157f3720","Type":"ContainerDied","Data":"a23f84b8b8509a498fc3f706b0240cb8324e756a058be04f252c84f5cbc59b6e"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.100873 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23f84b8b8509a498fc3f706b0240cb8324e756a058be04f252c84f5cbc59b6e" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.100985 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts" (OuterVolumeSpecName: "scripts") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.101051 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz" (OuterVolumeSpecName: "kube-api-access-lrsvz") pod "463a9578-5093-4812-a27c-c4d849a5ec67" (UID: "463a9578-5093-4812-a27c-c4d849a5ec67"). InnerVolumeSpecName "kube-api-access-lrsvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.101659 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.102423 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr" (OuterVolumeSpecName: "kube-api-access-tsjkr") pod "0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" (UID: "0ecc84ff-bf6e-49da-b3e7-24f595f52cd9"). InnerVolumeSpecName "kube-api-access-tsjkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.104152 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-d6cf467cf-6jvjp" event={"ID":"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293","Type":"ContainerDied","Data":"396790b96bf82f48099bdd7be3654af1eda2c75580e145bcd422441e0776cb96"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.104465 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="396790b96bf82f48099bdd7be3654af1eda2c75580e145bcd422441e0776cb96" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.104414 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct" (OuterVolumeSpecName: "kube-api-access-tw9ct") pod "a0ff0594-a3c0-4939-8113-d4611d0a53ce" (UID: "a0ff0594-a3c0-4939-8113-d4611d0a53ce"). InnerVolumeSpecName "kube-api-access-tw9ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.106065 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp" (OuterVolumeSpecName: "kube-api-access-cd9gp") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "kube-api-access-cd9gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.123456 4810 generic.go:334] "Generic (PLEG): container finished" podID="03d4344e-bfcb-42d1-8bcb-6924d24ae77b" containerID="c03970b52fc6ca7025b62de785b1139fa830c15bd6e21199d278ece8a17a9f93" exitCode=1 Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.123536 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell13fd2-account-delete-ggrsp" event={"ID":"03d4344e-bfcb-42d1-8bcb-6924d24ae77b","Type":"ContainerDied","Data":"c03970b52fc6ca7025b62de785b1139fa830c15bd6e21199d278ece8a17a9f93"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.199654 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.200725 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nk28d\" (UniqueName: \"kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.200825 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201034 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201115 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201230 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201328 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: 
\"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201420 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201582 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201675 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201747 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.201916 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzd5q\" (UniqueName: \"kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q\") pod \"36616a3d-b630-49fe-80a4-5e024f5f575f\" (UID: \"36616a3d-b630-49fe-80a4-5e024f5f575f\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.202002 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts\") pod \"6545022e-2717-4dda-801e-d88c8b037558\" (UID: \"6545022e-2717-4dda-801e-d88c8b037558\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.202699 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.202823 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d48e153-4470-4082-8f04-319557247b18-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205253 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205355 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14265138-5fbe-4187-928c-fbe84d832080-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205462 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9ct\" (UniqueName: \"kubernetes.io/projected/a0ff0594-a3c0-4939-8113-d4611d0a53ce-kube-api-access-tw9ct\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205524 4810 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psh77\" (UniqueName: \"kubernetes.io/projected/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-kube-api-access-psh77\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.206089 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbqkf\" (UniqueName: \"kubernetes.io/projected/14265138-5fbe-4187-928c-fbe84d832080-kube-api-access-nbqkf\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.206315 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463a9578-5093-4812-a27c-c4d849a5ec67-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.206416 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd9gp\" (UniqueName: \"kubernetes.io/projected/3d48e153-4470-4082-8f04-319557247b18-kube-api-access-cd9gp\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.206507 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.225007 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrsvz\" (UniqueName: \"kubernetes.io/projected/463a9578-5093-4812-a27c-c4d849a5ec67-kube-api-access-lrsvz\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.225051 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsjkr\" (UniqueName: \"kubernetes.io/projected/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9-kube-api-access-tsjkr\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.210254 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs" (OuterVolumeSpecName: "logs") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.210481 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.211247 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205284 4810 generic.go:334] "Generic (PLEG): container finished" podID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerID="e23cb5ec44af577e88e6c384a440a4a119a181d40b53ef85b850150499c498d8" exitCode=0 Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.205312 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerDied","Data":"e23cb5ec44af577e88e6c384a440a4a119a181d40b53ef85b850150499c498d8"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.220088 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" (UID: "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.230369 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-577984bbd4-45rx5" event={"ID":"d03f8837-7028-4fa8-b8e2-9e12a35037c5","Type":"ContainerDied","Data":"eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.230415 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eca781eff596eb97c1221712fe3b1795ad4f319bf8f61527b9fdb0d4cdd60b0c" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.235580 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6545022e-2717-4dda-801e-d88c8b037558","Type":"ContainerDied","Data":"0026cbfd4bc90a0bd014e4886b14e2539b70a78feaada6ac4f06b620a695d53a"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.235657 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.240086 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts" (OuterVolumeSpecName: "scripts") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.243177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d" (OuterVolumeSpecName: "kube-api-access-nk28d") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). InnerVolumeSpecName "kube-api-access-nk28d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.243787 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q" (OuterVolumeSpecName: "kube-api-access-rzd5q") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "kube-api-access-rzd5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.243980 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"72f95e65-4a0d-4f33-b977-abca4f20aeef","Type":"ContainerDied","Data":"343a8f5e3e000ebfa8bc8688340b12ec57a2289f81805634ddaa24931011bdb5"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.244021 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="343a8f5e3e000ebfa8bc8688340b12ec57a2289f81805634ddaa24931011bdb5" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.244521 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.245709 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6866556666-25fj4" event={"ID":"36616a3d-b630-49fe-80a4-5e024f5f575f","Type":"ContainerDied","Data":"c02443ef2ea493851553b7fde9dbf8dff0cc4ff927abad827210275f8ab476a6"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.245948 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6866556666-25fj4" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.247732 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.252793 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3d48e153-4470-4082-8f04-319557247b18","Type":"ContainerDied","Data":"28e7e44846cdd8453c36b8be13314a2d7a56d1b1d0b054b8b1ad8927942162af"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.252909 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.259257 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e podName:51261961-7cea-4ee8-8009-d0669f796caa nodeName:}" failed. No retries permitted until 2025-10-03 09:12:12.759237938 +0000 UTC m=+8166.186488673 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "mysql-db" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.273179 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican6011-account-delete-8mtn8" event={"ID":"699b1f34-ccf2-4905-886d-10afe56a758f","Type":"ContainerDied","Data":"20a298b5beb5f87533c8b58a64b8c2100e2c6894a3174fbf1cbee1085ac7cc3a"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.273220 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20a298b5beb5f87533c8b58a64b8c2100e2c6894a3174fbf1cbee1085ac7cc3a" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.283096 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.285347 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8a54421e-d5ab-4e38-a6da-d90860a2b3b3","Type":"ContainerDied","Data":"dc36f29d4851ccdd10540d1952a6da61ea70f4d8b54c849a51e8c9416da5792b"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.285441 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.287822 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.292126 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerID="2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8" exitCode=0 Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.292328 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerDied","Data":"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.294397 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronf221-account-delete-pkq85" event={"ID":"b0e6d09e-815b-4c16-ae3e-79550f72b352","Type":"ContainerDied","Data":"8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.294771 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e476cb5622964df82253520a30e7adcb2fb1843a477b2038c88162c76302387" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.296706 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.300432 4810 generic.go:334] "Generic (PLEG): container finished" podID="c56de889-4796-4d3c-87b0-faff35387c26" containerID="211daa9ed516000f1909977a5746e579501a880585f3b68881c446794b0cd9ed" exitCode=0 Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.300631 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.301562 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c56de889-4796-4d3c-87b0-faff35387c26","Type":"ContainerDied","Data":"211daa9ed516000f1909977a5746e579501a880585f3b68881c446794b0cd9ed"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.301591 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c56de889-4796-4d3c-87b0-faff35387c26","Type":"ContainerDied","Data":"c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84"} Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.301603 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c947dfcfec2923c5b92adf998ecde1100fdf28fc9c04b86c7cf91de99da8ef84" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.301638 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heatd134-account-delete-5svfk" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.300538 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder4b2d-account-delete-5zfwv" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.303113 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.303531 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.311228 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.311231 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.311803 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c96f8d7b-nv762" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.323540 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.325450 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.325516 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerName="nova-scheduler-scheduler" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327295 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327437 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327453 4810 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6545022e-2717-4dda-801e-d88c8b037558-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327466 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzd5q\" (UniqueName: \"kubernetes.io/projected/36616a3d-b630-49fe-80a4-5e024f5f575f-kube-api-access-rzd5q\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327477 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327490 4810 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327503 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327515 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327526 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nk28d\" (UniqueName: \"kubernetes.io/projected/6545022e-2717-4dda-801e-d88c8b037558-kube-api-access-nk28d\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.327537 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36616a3d-b630-49fe-80a4-5e024f5f575f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.374671 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463a9578-5093-4812-a27c-c4d849a5ec67" (UID: "463a9578-5093-4812-a27c-c4d849a5ec67"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.375198 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data" (OuterVolumeSpecName: "config-data") pod "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" (UID: "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.376376 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.429697 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.429940 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.430067 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.459106 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data" (OuterVolumeSpecName: "config-data") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.481211 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.481324 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.484205 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.489911 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.490312 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.500165 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.154:3000/\": dial tcp 10.217.1.154:3000: connect: connection refused" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.533379 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") pod \"keystoneb067-account-delete-b8gsz\" (UID: \"406a32fa-cd77-42ae-9239-80be47525eba\") " pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.533485 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.533496 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.538477 4810 projected.go:194] Error preparing data for projected volume kube-api-access-56h88 for pod openstack/keystoneb067-account-delete-b8gsz: 
failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.538558 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88 podName:406a32fa-cd77-42ae-9239-80be47525eba nodeName:}" failed. No retries permitted until 2025-10-03 09:12:14.538535186 +0000 UTC m=+8167.965785921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-56h88" (UniqueName: "kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88") pod "keystoneb067-account-delete-b8gsz" (UID: "406a32fa-cd77-42ae-9239-80be47525eba") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.562372 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.564257 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.4:5671: connect: connection refused" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.579433 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.587927 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.591337 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.5:5671: connect: connection refused" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.606479 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data" (OuterVolumeSpecName: "config-data") pod "463a9578-5093-4812-a27c-c4d849a5ec67" (UID: "463a9578-5093-4812-a27c-c4d849a5ec67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.618046 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.638469 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.638540 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.638554 4810 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/51261961-7cea-4ee8-8009-d0669f796caa-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.638569 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.638582 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.682131 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.685825 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.697878 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.728054 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data" (OuterVolumeSpecName: "config-data") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.741947 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data" (OuterVolumeSpecName: "config-data") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.742926 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") pod \"990ef3ca-87e1-4b2b-8714-86263523425b\" (UID: \"990ef3ca-87e1-4b2b-8714-86263523425b\") " Oct 03 09:12:12 crc kubenswrapper[4810]: W1003 09:12:12.743092 4810 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/990ef3ca-87e1-4b2b-8714-86263523425b/volumes/kubernetes.io~secret/config-data Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743111 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data" (OuterVolumeSpecName: "config-data") pod "990ef3ca-87e1-4b2b-8714-86263523425b" (UID: "990ef3ca-87e1-4b2b-8714-86263523425b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743435 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743456 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743464 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/990ef3ca-87e1-4b2b-8714-86263523425b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743474 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.743483 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.754204 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data" (OuterVolumeSpecName: "config-data") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.763225 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.769668 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.785912 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.800342 4810 scope.go:117] "RemoveContainer" containerID="180bad1c50d897c2a6c43df0a1afbfa7d6c5a60011b7aa487c86d10db9b1980d" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.817623 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.845825 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.853214 4810 scope.go:117] "RemoveContainer" containerID="c0451198ffb49cf3efcc829fbf4f4b7d23054fd0f095c80c41379ab2c12fc9df" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.860548 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "463a9578-5093-4812-a27c-c4d849a5ec67" (UID: "463a9578-5093-4812-a27c-c4d849a5ec67"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.868286 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" (UID: "abef8580-e6ef-4e8a-887d-bdf45e9a5bbe"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.868464 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.869838 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: E1003 09:12:12.876811 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1d96d1_04f9_4dd6_aabc_5a55b291d47b.slice/crio-65b845554a1583b514248264c81c62266f15b84561d0fa8db84ce822860fbde3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ecc84ff_bf6e_49da_b3e7_24f595f52cd9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d1d96d1_04f9_4dd6_aabc_5a55b291d47b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0ff0594_a3c0_4939_8113_d4611d0a53ce.slice/crio-094d459475b946fcaebb762c881b67487d2e509ed5632e02021af2452b9ed1ca\": RecentStats: unable to find data in memory cache]" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.885366 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") pod \"51261961-7cea-4ee8-8009-d0669f796caa\" (UID: \"51261961-7cea-4ee8-8009-d0669f796caa\") " Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.889548 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.889809 4810 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/463a9578-5093-4812-a27c-c4d849a5ec67-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.889877 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.889909 4810 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.889959 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.891791 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heatd134-account-delete-5svfk"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.893583 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.903128 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.907688 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.913458 4810 scope.go:117] "RemoveContainer" containerID="60d21af689f98576a4bb81b1221ab2ccefd04ca65804bc0605a1a62f203b010e" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.915557 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder4b2d-account-delete-5zfwv"] Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.948422 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.949280 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e" (OuterVolumeSpecName: "mysql-db") pod "51261961-7cea-4ee8-8009-d0669f796caa" (UID: "51261961-7cea-4ee8-8009-d0669f796caa"). InnerVolumeSpecName "pvc-b8689870-6384-481d-b268-4dddad8a462e". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.963222 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:12 crc kubenswrapper[4810]: I1003 09:12:12.996189 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3d48e153-4470-4082-8f04-319557247b18" (UID: "3d48e153-4470-4082-8f04-319557247b18"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000274 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config\") pod \"64def3d7-6342-432a-a0c2-b562b7514bca\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000346 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000396 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle\") pod \"64def3d7-6342-432a-a0c2-b562b7514bca\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000423 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvtgt\" (UniqueName: \"kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt\") pod \"19f98119-f52a-4cea-a154-7380157f3720\" (UID: \"19f98119-f52a-4cea-a154-7380157f3720\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000487 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcdzj\" (UniqueName: \"kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: 
\"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000530 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs\") pod \"64def3d7-6342-432a-a0c2-b562b7514bca\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000621 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data\") pod \"72f95e65-4a0d-4f33-b977-abca4f20aeef\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000662 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000689 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000728 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000764 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000815 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000862 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f26jm\" (UniqueName: \"kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm\") pod \"64def3d7-6342-432a-a0c2-b562b7514bca\" (UID: \"64def3d7-6342-432a-a0c2-b562b7514bca\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000913 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000942 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbf77\" (UniqueName: 
\"kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77\") pod \"b0e6d09e-815b-4c16-ae3e-79550f72b352\" (UID: \"b0e6d09e-815b-4c16-ae3e-79550f72b352\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.000984 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001044 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001070 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle\") pod \"72f95e65-4a0d-4f33-b977-abca4f20aeef\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001116 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-294xk\" (UniqueName: \"kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk\") pod \"e600d090-3ebc-4552-a5d6-0c3de54c5de3\" (UID: \"e600d090-3ebc-4552-a5d6-0c3de54c5de3\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001185 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001243 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data\") pod \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\" (UID: \"d03f8837-7028-4fa8-b8e2-9e12a35037c5\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.001330 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ss5j\" (UniqueName: \"kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j\") pod \"72f95e65-4a0d-4f33-b977-abca4f20aeef\" (UID: \"72f95e65-4a0d-4f33-b977-abca4f20aeef\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.002648 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") on node \"crc\" " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.002684 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d48e153-4470-4082-8f04-319557247b18-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.002695 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-public-tls-certs\") on node \"crc\" DevicePath \"\"" 
Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.015549 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.021004 4810 scope.go:117] "RemoveContainer" containerID="592213b9cf15535204f9f07e1afe93eab58cb7c7e9c4720ea154fc4abf864d38" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.031646 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.039082 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.072579 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.075608 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj" (OuterVolumeSpecName: "kube-api-access-jcdzj") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "kube-api-access-jcdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.089482 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data" (OuterVolumeSpecName: "config-data") pod "6545022e-2717-4dda-801e-d88c8b037558" (UID: "6545022e-2717-4dda-801e-d88c8b037558"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.090252 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk" (OuterVolumeSpecName: "kube-api-access-294xk") pod "e600d090-3ebc-4552-a5d6-0c3de54c5de3" (UID: "e600d090-3ebc-4552-a5d6-0c3de54c5de3"). InnerVolumeSpecName "kube-api-access-294xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.090737 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j" (OuterVolumeSpecName: "kube-api-access-6ss5j") pod "72f95e65-4a0d-4f33-b977-abca4f20aeef" (UID: "72f95e65-4a0d-4f33-b977-abca4f20aeef"). InnerVolumeSpecName "kube-api-access-6ss5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.097953 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.098254 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77" (OuterVolumeSpecName: "kube-api-access-rbf77") pod "b0e6d09e-815b-4c16-ae3e-79550f72b352" (UID: "b0e6d09e-815b-4c16-ae3e-79550f72b352"). 
InnerVolumeSpecName "kube-api-access-rbf77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.104082 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.105987 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.109320 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs" (OuterVolumeSpecName: "kube-api-access-hpqrs") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "kube-api-access-hpqrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.112234 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115234 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config\") pod \"c56de889-4796-4d3c-87b0-faff35387c26\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115303 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs\") pod \"c56de889-4796-4d3c-87b0-faff35387c26\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115339 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfhrg\" (UniqueName: \"kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg\") pod \"699b1f34-ccf2-4905-886d-10afe56a758f\" (UID: \"699b1f34-ccf2-4905-886d-10afe56a758f\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115417 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") pod \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\" (UID: \"8a54421e-d5ab-4e38-a6da-d90860a2b3b3\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115449 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data\") pod \"c56de889-4796-4d3c-87b0-faff35387c26\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115497 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle\") pod \"c56de889-4796-4d3c-87b0-faff35387c26\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115598 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq4th\" (UniqueName: \"kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th\") pod \"03d4344e-bfcb-42d1-8bcb-6924d24ae77b\" (UID: \"03d4344e-bfcb-42d1-8bcb-6924d24ae77b\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115625 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm492\" (UniqueName: \"kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492\") pod \"c56de889-4796-4d3c-87b0-faff35387c26\" (UID: \"c56de889-4796-4d3c-87b0-faff35387c26\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.115643 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") pod \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\" (UID: \"8cbd704a-5b8a-47e7-a5f9-7c6a20e01293\") " Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116277 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcdzj\" (UniqueName: \"kubernetes.io/projected/d03f8837-7028-4fa8-b8e2-9e12a35037c5-kube-api-access-jcdzj\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116297 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6545022e-2717-4dda-801e-d88c8b037558-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116308 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbf77\" (UniqueName: \"kubernetes.io/projected/b0e6d09e-815b-4c16-ae3e-79550f72b352-kube-api-access-rbf77\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116316 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116326 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-294xk\" (UniqueName: \"kubernetes.io/projected/e600d090-3ebc-4552-a5d6-0c3de54c5de3-kube-api-access-294xk\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116335 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ss5j\" (UniqueName: \"kubernetes.io/projected/72f95e65-4a0d-4f33-b977-abca4f20aeef-kube-api-access-6ss5j\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: W1003 09:12:13.116424 4810 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293/volumes/kubernetes.io~projected/kube-api-access-hpqrs Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116504 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs" (OuterVolumeSpecName: "kube-api-access-hpqrs") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "kube-api-access-hpqrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.116570 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.119221 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c56de889-4796-4d3c-87b0-faff35387c26" (UID: "c56de889-4796-4d3c-87b0-faff35387c26"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: W1003 09:12:13.119332 4810 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/8a54421e-d5ab-4e38-a6da-d90860a2b3b3/volumes/kubernetes.io~secret/combined-ca-bundle Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.119349 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.122167 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data" (OuterVolumeSpecName: "config-data") pod "c56de889-4796-4d3c-87b0-faff35387c26" (UID: "c56de889-4796-4d3c-87b0-faff35387c26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.128326 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th" (OuterVolumeSpecName: "kube-api-access-sq4th") pod "03d4344e-bfcb-42d1-8bcb-6924d24ae77b" (UID: "03d4344e-bfcb-42d1-8bcb-6924d24ae77b"). InnerVolumeSpecName "kube-api-access-sq4th". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.148019 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492" (OuterVolumeSpecName: "kube-api-access-wm492") pod "c56de889-4796-4d3c-87b0-faff35387c26" (UID: "c56de889-4796-4d3c-87b0-faff35387c26"). InnerVolumeSpecName "kube-api-access-wm492". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.152345 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm" (OuterVolumeSpecName: "kube-api-access-f26jm") pod "64def3d7-6342-432a-a0c2-b562b7514bca" (UID: "64def3d7-6342-432a-a0c2-b562b7514bca"). InnerVolumeSpecName "kube-api-access-f26jm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.152651 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg" (OuterVolumeSpecName: "kube-api-access-hfhrg") pod "699b1f34-ccf2-4905-886d-10afe56a758f" (UID: "699b1f34-ccf2-4905-886d-10afe56a758f"). InnerVolumeSpecName "kube-api-access-hfhrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.166754 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt" (OuterVolumeSpecName: "kube-api-access-pvtgt") pod "19f98119-f52a-4cea-a154-7380157f3720" (UID: "19f98119-f52a-4cea-a154-7380157f3720"). InnerVolumeSpecName "kube-api-access-pvtgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.168061 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "36616a3d-b630-49fe-80a4-5e024f5f575f" (UID: "36616a3d-b630-49fe-80a4-5e024f5f575f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229611 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229654 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229667 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvtgt\" (UniqueName: \"kubernetes.io/projected/19f98119-f52a-4cea-a154-7380157f3720-kube-api-access-pvtgt\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229684 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq4th\" (UniqueName: \"kubernetes.io/projected/03d4344e-bfcb-42d1-8bcb-6924d24ae77b-kube-api-access-sq4th\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229697 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpqrs\" (UniqueName: \"kubernetes.io/projected/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-kube-api-access-hpqrs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229709 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm492\" (UniqueName: \"kubernetes.io/projected/c56de889-4796-4d3c-87b0-faff35387c26-kube-api-access-wm492\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229720 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229731 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f26jm\" (UniqueName: 
\"kubernetes.io/projected/64def3d7-6342-432a-a0c2-b562b7514bca-kube-api-access-f26jm\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229742 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c56de889-4796-4d3c-87b0-faff35387c26-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229753 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36616a3d-b630-49fe-80a4-5e024f5f575f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.229768 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfhrg\" (UniqueName: \"kubernetes.io/projected/699b1f34-ccf2-4905-886d-10afe56a758f-kube-api-access-hfhrg\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.268727 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.278687 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.290258 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.290451 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b8689870-6384-481d-b268-4dddad8a462e" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e") on node "crc" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.310338 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.318790 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data" (OuterVolumeSpecName: "config-data") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.332672 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.332713 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-b8689870-6384-481d-b268-4dddad8a462e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8689870-6384-481d-b268-4dddad8a462e\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.332730 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.348296 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" path="/var/lib/kubelet/pods/0ecc84ff-bf6e-49da-b3e7-24f595f52cd9/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.348906 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e116f1-db5c-41a3-9a6e-03c09d7f8103" path="/var/lib/kubelet/pods/29e116f1-db5c-41a3-9a6e-03c09d7f8103/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.349530 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" path="/var/lib/kubelet/pods/8d1d96d1-04f9-4dd6-aabc-5a55b291d47b/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.351889 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" path="/var/lib/kubelet/pods/990ef3ca-87e1-4b2b-8714-86263523425b/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.353545 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data" (OuterVolumeSpecName: "config-data") pod "72f95e65-4a0d-4f33-b977-abca4f20aeef" (UID: "72f95e65-4a0d-4f33-b977-abca4f20aeef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.353968 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ff0594-a3c0-4939-8113-d4611d0a53ce" path="/var/lib/kubelet/pods/a0ff0594-a3c0-4939-8113-d4611d0a53ce/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.354838 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d" path="/var/lib/kubelet/pods/a2c3e51f-abd0-4a2f-870f-ad7a5d16c03d/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.357043 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" path="/var/lib/kubelet/pods/abef8580-e6ef-4e8a-887d-bdf45e9a5bbe/volumes" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.391449 4810 generic.go:334] "Generic (PLEG): container finished" podID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerID="3e4b15b2571b4b9e95c498a522370b026aa306f3f2b72b93a885f270f401f309" exitCode=0 Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.400177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72f95e65-4a0d-4f33-b977-abca4f20aeef" (UID: "72f95e65-4a0d-4f33-b977-abca4f20aeef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.400217 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14265138-5fbe-4187-928c-fbe84d832080" (UID: "14265138-5fbe-4187-928c-fbe84d832080"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.407874 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell13fd2-account-delete-ggrsp" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.420009 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.420852 4810 generic.go:334] "Generic (PLEG): container finished" podID="849f27b3-b111-4722-9e85-528f2fbed78d" containerID="5b4764a5166f67c1b430dfc61b76795284f3ddc58b5b1b1ce0e10d2c554b70f0" exitCode=0 Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.425710 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.425924 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426144 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426249 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-577984bbd4-45rx5" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426288 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronf221-account-delete-pkq85" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426335 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0d0e8-account-delete-2dtvp" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426382 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican6011-account-delete-8mtn8" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426670 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426704 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementbfbb-account-delete-nxskb" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.426746 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-d6cf467cf-6jvjp" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.440398 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.440430 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.440439 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14265138-5fbe-4187-928c-fbe84d832080-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.440449 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f95e65-4a0d-4f33-b977-abca4f20aeef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.450424 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "64def3d7-6342-432a-a0c2-b562b7514bca" (UID: "64def3d7-6342-432a-a0c2-b562b7514bca"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.474315 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.503882 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c56de889-4796-4d3c-87b0-faff35387c26" (UID: "c56de889-4796-4d3c-87b0-faff35387c26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.505205 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" (UID: "8cbd704a-5b8a-47e7-a5f9-7c6a20e01293"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.549905 4810 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.549944 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.549964 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.549982 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.550342 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.587701 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data" (OuterVolumeSpecName: "config-data") pod "8a54421e-d5ab-4e38-a6da-d90860a2b3b3" (UID: "8a54421e-d5ab-4e38-a6da-d90860a2b3b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.591172 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64def3d7-6342-432a-a0c2-b562b7514bca" (UID: "64def3d7-6342-432a-a0c2-b562b7514bca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.604015 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "64def3d7-6342-432a-a0c2-b562b7514bca" (UID: "64def3d7-6342-432a-a0c2-b562b7514bca"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.609401 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "c56de889-4796-4d3c-87b0-faff35387c26" (UID: "c56de889-4796-4d3c-87b0-faff35387c26"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.623981 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.642461 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data" (OuterVolumeSpecName: "config-data") pod "d03f8837-7028-4fa8-b8e2-9e12a35037c5" (UID: "d03f8837-7028-4fa8-b8e2-9e12a35037c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.651954 4810 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56de889-4796-4d3c-87b0-faff35387c26-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.651999 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.652012 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.652028 4810 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/64def3d7-6342-432a-a0c2-b562b7514bca-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.652043 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a54421e-d5ab-4e38-a6da-d90860a2b3b3-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.652054 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.652065 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d03f8837-7028-4fa8-b8e2-9e12a35037c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.721071 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerDied","Data":"3e4b15b2571b4b9e95c498a522370b026aa306f3f2b72b93a885f270f401f309"} Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.721112 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell13fd2-account-delete-ggrsp" event={"ID":"03d4344e-bfcb-42d1-8bcb-6924d24ae77b","Type":"ContainerDied","Data":"f5e391710f681b3c8dc1e1f31390ba4212d1e600d3a0cb00683e292ee4bff9d0"} Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.721132 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerDied","Data":"5b4764a5166f67c1b430dfc61b76795284f3ddc58b5b1b1ce0e10d2c554b70f0"} Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.721143 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"849f27b3-b111-4722-9e85-528f2fbed78d","Type":"ContainerDied","Data":"a3f7f18a094e7d135f475e1d019258020270f8f6abe77f3020c3cdfa60c2d04f"} Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.721152 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3f7f18a094e7d135f475e1d019258020270f8f6abe77f3020c3cdfa60c2d04f" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.734232 4810 scope.go:117] "RemoveContainer" 
containerID="bef0d553e5791a6b7a41534f43da71d6aedce548e058b8e26b6a986f52a3616f" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.894082 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.937391 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.60:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.937746 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.1.60:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.942251 4810 scope.go:117] "RemoveContainer" containerID="68ad4f5b62f1bf62badf506a3cc48bf9e7ddd204e527495d5adf65f34b954e77" Oct 03 09:12:13 crc kubenswrapper[4810]: I1003 09:12:13.968565 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.008445 4810 scope.go:117] "RemoveContainer" containerID="4458f484a54244dae918accd79a7327d261f442db2d6d5731024ddb8dc509061" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.059424 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.059783 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.062184 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.062290 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.062962 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.063113 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.063255 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.063331 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsl8h\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.064147 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.064566 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.065157 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.065458 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.065483 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068589 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068653 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068677 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068694 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjt2s\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068733 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068782 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068805 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068830 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068851 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf\") pod \"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068880 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068910 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd\") pod 
\"849f27b3-b111-4722-9e85-528f2fbed78d\" (UID: \"849f27b3-b111-4722-9e85-528f2fbed78d\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.068949 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret\") pod \"e9436321-8243-4a4a-b07a-b14668a07f1f\" (UID: \"e9436321-8243-4a4a-b07a-b14668a07f1f\") " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.069559 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.069581 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.069592 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.074258 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.074665 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.075088 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.078689 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info" (OuterVolumeSpecName: "pod-info") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.081089 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.083692 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h" (OuterVolumeSpecName: "kube-api-access-fsl8h") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "kube-api-access-fsl8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.086963 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.087069 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s" (OuterVolumeSpecName: "kube-api-access-tjt2s") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "kube-api-access-tjt2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.094270 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.094355 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info" (OuterVolumeSpecName: "pod-info") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.106671 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.117619 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0" (OuterVolumeSpecName: "persistence") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.137232 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180" (OuterVolumeSpecName: "persistence") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174522 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9436321-8243-4a4a-b07a-b14668a07f1f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174555 4810 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174569 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/849f27b3-b111-4722-9e85-528f2fbed78d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174578 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjt2s\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-kube-api-access-tjt2s\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174587 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174596 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174604 4810 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/849f27b3-b111-4722-9e85-528f2fbed78d-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174613 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174622 4810 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9436321-8243-4a4a-b07a-b14668a07f1f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174631 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174660 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") on node \"crc\" " 
Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174671 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsl8h\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-kube-api-access-fsl8h\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.174687 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") on node \"crc\" " Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.188558 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data" (OuterVolumeSpecName: "config-data") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.274358 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf" (OuterVolumeSpecName: "server-conf") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.276435 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.276458 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9436321-8243-4a4a-b07a-b14668a07f1f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.279476 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf" (OuterVolumeSpecName: "server-conf") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.326051 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.347401 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
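(Aside: the csi_attacher records in this log, "STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...", show the kubelet skipping the NodeUnstageVolume step because the hostpath provisioner does not advertise that node capability; the "UnmountDevice succeeded" record that follows is therefore a no-op on the driver side. Below is a rough Go sketch of the same capability query against a CSI driver's node socket, under the assumption of a socket path like /var/lib/kubelet/plugins/csi-hostpath/csi.sock — the real path depends on how the driver is deployed.)

    package main

    import (
        "context"
        "fmt"
        "log"
        "time"

        csi "github.com/container-storage-interface/spec/lib/go/csi"
        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    func main() {
        // Assumed socket path; adjust for the driver in question.
        const sock = "unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock"

        conn, err := grpc.Dial(sock, grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        // Ask the node plugin which optional RPCs it supports.
        resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
        if err != nil {
            log.Fatal(err)
        }
        has := false
        for _, c := range resp.GetCapabilities() {
            if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
                has = true
            }
        }
        fmt.Println("STAGE_UNSTAGE_VOLUME advertised:", has)
    }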
Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.347588 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180") on node "crc" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.350056 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data" (OuterVolumeSpecName: "config-data") pod "849f27b3-b111-4722-9e85-528f2fbed78d" (UID: "849f27b3-b111-4722-9e85-528f2fbed78d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.355851 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.356042 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0") on node "crc" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.378518 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.378552 4810 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/849f27b3-b111-4722-9e85-528f2fbed78d-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.378562 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/849f27b3-b111-4722-9e85-528f2fbed78d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.378574 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07d5f873-7cd4-41d6-bc13-43b2740c5180\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.378585 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-328b680d-c2e3-4c30-9888-0a19e5506cf0\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.420822 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e9436321-8243-4a4a-b07a-b14668a07f1f" (UID: "e9436321-8243-4a4a-b07a-b14668a07f1f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.467490 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.482001 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.483142 4810 scope.go:117] "RemoveContainer" containerID="a51f1a244e80ed64834a6d007023e154a4199448c67d8f2e29312bbb25ff2f8c" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.495011 4810 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9436321-8243-4a4a-b07a-b14668a07f1f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.505465 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.523486 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.537676 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.598856 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") pod \"keystoneb067-account-delete-b8gsz\" (UID: \"406a32fa-cd77-42ae-9239-80be47525eba\") " pod="openstack/keystoneb067-account-delete-b8gsz" Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.621418 4810 projected.go:194] Error preparing data for projected volume kube-api-access-56h88 for pod openstack/keystoneb067-account-delete-b8gsz: failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.621491 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88 podName:406a32fa-cd77-42ae-9239-80be47525eba nodeName:}" failed. No retries permitted until 2025-10-03 09:12:18.621471048 +0000 UTC m=+8172.048721783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-56h88" (UniqueName: "kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88") pod "keystoneb067-account-delete-b8gsz" (UID: "406a32fa-cd77-42ae-9239-80be47525eba") : failed to fetch token: serviceaccounts "galera-openstack" not found Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.631046 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.704962 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.710635 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.745730 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.751124 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell13fd2-account-delete-ggrsp"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.751166 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e9436321-8243-4a4a-b07a-b14668a07f1f","Type":"ContainerDied","Data":"46f8e2d9a7f0ec2ad3caba55ca139b608a726f8949b644c7431e5c40575e10e5"} Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.765904 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.767492 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.780059 4810 scope.go:117] "RemoveContainer" containerID="2ba2dbc8167fc1bd6f1b2a1f51b4d2aeb50ad0c06902101cb2b2d5cf0dcefff2" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.818665 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.856199 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:14 crc kubenswrapper[4810]: E1003 09:12:14.856280 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.864867 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerStarted","Data":"3ef0c263e1c8f4ced443d655ece4132e18690001db86325aea223aa3d58b358e"} Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.867960 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.919387 4810 generic.go:334] "Generic (PLEG): container finished" podID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerID="6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" exitCode=0 Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.919464 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1ba8eff-5520-4703-911a-992a05fc8bdc","Type":"ContainerDied","Data":"6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834"} Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.919505 4810 scope.go:117] "RemoveContainer" containerID="c978bb239f34435050aeea1f40571c0760c669b74d2a1948cabd533429556d2a" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.920205 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placementbfbb-account-delete-nxskb"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.946720 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.948057 4810 generic.go:334] "Generic (PLEG): container finished" podID="e4be45a7-4886-405d-b10b-0519258715f6" containerID="3db6c0f0fe1196ece0d1fd23d0a431452da6842720d8625cef9b225452286fdb" exitCode=0 Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.948213 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.961063 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9848c8cf-kb9qj" event={"ID":"e4be45a7-4886-405d-b10b-0519258715f6","Type":"ContainerDied","Data":"3db6c0f0fe1196ece0d1fd23d0a431452da6842720d8625cef9b225452286fdb"} Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.961356 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c96f8d7b-nv762"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.971995 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.982635 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican6011-account-delete-8mtn8"] Oct 03 09:12:14 crc kubenswrapper[4810]: I1003 09:12:14.991198 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.021972 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0d0e8-account-delete-2dtvp"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.036268 4810 scope.go:117] "RemoveContainer" containerID="5440941186e8bdddd5f88722a1e8ed75a86f824038cb407b01408da32a0da878" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.040694 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.054445 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.061703 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.070592 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6866556666-25fj4"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.076805 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.091804 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronf221-account-delete-pkq85"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.116161 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystoneb067-account-delete-b8gsz"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.124695 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystoneb067-account-delete-b8gsz"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.133381 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.139079 4810 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/heat-cfnapi-577984bbd4-45rx5"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.142413 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.144580 4810 scope.go:117] "RemoveContainer" containerID="08cd416b8eeef505b3e0b38ba8da9d068bedaeea3f4d831442ca091390459861" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.147228 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.156052 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: E1003 09:12:15.166564 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.166810 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: E1003 09:12:15.171525 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.173266 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: E1003 09:12:15.177058 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 03 09:12:15 crc kubenswrapper[4810]: E1003 09:12:15.177186 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="galera" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.182044 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.188821 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.195265 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.202212 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-d6cf467cf-6jvjp"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.219133 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.225459 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data\") pod \"e1ba8eff-5520-4703-911a-992a05fc8bdc\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.225622 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle\") pod \"e1ba8eff-5520-4703-911a-992a05fc8bdc\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.225749 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22tll\" (UniqueName: \"kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll\") pod \"e1ba8eff-5520-4703-911a-992a05fc8bdc\" (UID: \"e1ba8eff-5520-4703-911a-992a05fc8bdc\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.226284 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56h88\" (UniqueName: \"kubernetes.io/projected/406a32fa-cd77-42ae-9239-80be47525eba-kube-api-access-56h88\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.246648 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll" (OuterVolumeSpecName: "kube-api-access-22tll") pod "e1ba8eff-5520-4703-911a-992a05fc8bdc" (UID: "e1ba8eff-5520-4703-911a-992a05fc8bdc"). InnerVolumeSpecName "kube-api-access-22tll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.253490 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.265800 4810 scope.go:117] "RemoveContainer" containerID="c03970b52fc6ca7025b62de785b1139fa830c15bd6e21199d278ece8a17a9f93" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.272080 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.285917 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.290304 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data" (OuterVolumeSpecName: "config-data") pod "e1ba8eff-5520-4703-911a-992a05fc8bdc" (UID: "e1ba8eff-5520-4703-911a-992a05fc8bdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.291177 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ba8eff-5520-4703-911a-992a05fc8bdc" (UID: "e1ba8eff-5520-4703-911a-992a05fc8bdc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.329778 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d4344e-bfcb-42d1-8bcb-6924d24ae77b" path="/var/lib/kubelet/pods/03d4344e-bfcb-42d1-8bcb-6924d24ae77b/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.330344 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14265138-5fbe-4187-928c-fbe84d832080" path="/var/lib/kubelet/pods/14265138-5fbe-4187-928c-fbe84d832080/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.330823 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f98119-f52a-4cea-a154-7380157f3720" path="/var/lib/kubelet/pods/19f98119-f52a-4cea-a154-7380157f3720/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.331771 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.331789 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22tll\" (UniqueName: \"kubernetes.io/projected/e1ba8eff-5520-4703-911a-992a05fc8bdc-kube-api-access-22tll\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.331799 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ba8eff-5520-4703-911a-992a05fc8bdc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.334709 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" path="/var/lib/kubelet/pods/36616a3d-b630-49fe-80a4-5e024f5f575f/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.335745 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d48e153-4470-4082-8f04-319557247b18" path="/var/lib/kubelet/pods/3d48e153-4470-4082-8f04-319557247b18/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.336278 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="406a32fa-cd77-42ae-9239-80be47525eba" path="/var/lib/kubelet/pods/406a32fa-cd77-42ae-9239-80be47525eba/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.336664 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" path="/var/lib/kubelet/pods/463a9578-5093-4812-a27c-c4d849a5ec67/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.338033 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51261961-7cea-4ee8-8009-d0669f796caa" path="/var/lib/kubelet/pods/51261961-7cea-4ee8-8009-d0669f796caa/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.338653 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" path="/var/lib/kubelet/pods/64def3d7-6342-432a-a0c2-b562b7514bca/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.340503 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6545022e-2717-4dda-801e-d88c8b037558" path="/var/lib/kubelet/pods/6545022e-2717-4dda-801e-d88c8b037558/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.341340 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699b1f34-ccf2-4905-886d-10afe56a758f" 
path="/var/lib/kubelet/pods/699b1f34-ccf2-4905-886d-10afe56a758f/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.341966 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" path="/var/lib/kubelet/pods/72f95e65-4a0d-4f33-b977-abca4f20aeef/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.343453 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" path="/var/lib/kubelet/pods/849f27b3-b111-4722-9e85-528f2fbed78d/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.344163 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" path="/var/lib/kubelet/pods/8a54421e-d5ab-4e38-a6da-d90860a2b3b3/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.345286 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" path="/var/lib/kubelet/pods/8cbd704a-5b8a-47e7-a5f9-7c6a20e01293/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.346781 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0e6d09e-815b-4c16-ae3e-79550f72b352" path="/var/lib/kubelet/pods/b0e6d09e-815b-4c16-ae3e-79550f72b352/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.347439 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56de889-4796-4d3c-87b0-faff35387c26" path="/var/lib/kubelet/pods/c56de889-4796-4d3c-87b0-faff35387c26/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.348019 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" path="/var/lib/kubelet/pods/d03f8837-7028-4fa8-b8e2-9e12a35037c5/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.349503 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e600d090-3ebc-4552-a5d6-0c3de54c5de3" path="/var/lib/kubelet/pods/e600d090-3ebc-4552-a5d6-0c3de54c5de3/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.350365 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" path="/var/lib/kubelet/pods/e9436321-8243-4a4a-b07a-b14668a07f1f/volumes" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.353144 4810 scope.go:117] "RemoveContainer" containerID="3e4b15b2571b4b9e95c498a522370b026aa306f3f2b72b93a885f270f401f309" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.385984 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.393024 4810 scope.go:117] "RemoveContainer" containerID="4c78e1c7563ba76c46be5ea7699ccecaad95d9a403d18c473a5a4f7527d3cda2" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.473914 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.535413 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data\") pod \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.535459 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xmgq\" (UniqueName: \"kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq\") pod \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.535627 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle\") pod \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\" (UID: \"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.540833 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq" (OuterVolumeSpecName: "kube-api-access-5xmgq") pod "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" (UID: "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef"). InnerVolumeSpecName "kube-api-access-5xmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.600550 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" (UID: "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.601088 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data" (OuterVolumeSpecName: "config-data") pod "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" (UID: "eb12a3c2-3c6e-437a-ac06-021b2c18c4ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637469 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvr8v\" (UniqueName: \"kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637549 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637768 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637850 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637928 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.637959 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.638629 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.638710 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle\") pod \"e4be45a7-4886-405d-b10b-0519258715f6\" (UID: \"e4be45a7-4886-405d-b10b-0519258715f6\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.640009 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.640027 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.640045 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xmgq\" 
(UniqueName: \"kubernetes.io/projected/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef-kube-api-access-5xmgq\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.664989 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts" (OuterVolumeSpecName: "scripts") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.667232 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.667359 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v" (OuterVolumeSpecName: "kube-api-access-rvr8v") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "kube-api-access-rvr8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.667285 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.678194 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.678560 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data" (OuterVolumeSpecName: "config-data") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.739089 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742323 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742349 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvr8v\" (UniqueName: \"kubernetes.io/projected/e4be45a7-4886-405d-b10b-0519258715f6-kube-api-access-rvr8v\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742361 4810 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742369 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742378 4810 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742386 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.742393 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.762530 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4be45a7-4886-405d-b10b-0519258715f6" (UID: "e4be45a7-4886-405d-b10b-0519258715f6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.769400 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843242 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843288 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjk96\" (UniqueName: \"kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843406 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843431 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843475 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843527 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843564 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843582 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data\") pod \"1b8a0d09-74be-4ada-a91c-90a30042cc32\" (UID: \"1b8a0d09-74be-4ada-a91c-90a30042cc32\") " Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.843971 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4be45a7-4886-405d-b10b-0519258715f6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.844098 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.844918 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.849275 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96" (OuterVolumeSpecName: "kube-api-access-tjk96") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "kube-api-access-tjk96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.877938 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts" (OuterVolumeSpecName: "scripts") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.925437 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.934268 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.945145 4810 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.945613 4810 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.945764 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.945835 4810 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1b8a0d09-74be-4ada-a91c-90a30042cc32-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.945937 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjk96\" (UniqueName: \"kubernetes.io/projected/1b8a0d09-74be-4ada-a91c-90a30042cc32-kube-api-access-tjk96\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.976791 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983312 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983404 4810 generic.go:334] "Generic (PLEG): container finished" podID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" exitCode=0 Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983477 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983516 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerDied","Data":"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8"} Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983549 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68d389d4-c420-497b-81aa-3277eaeb1a13","Type":"ContainerDied","Data":"085ecb8ca9587e269951638a2a1f72a3180c5580db17c7ce5fa8b556d397dc67"} Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.983568 4810 scope.go:117] "RemoveContainer" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.994675 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_db2735e6-11ae-45cc-94fe-9628468a810a/ovn-northd/0.log" Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.994718 4810 generic.go:334] "Generic (PLEG): container finished" podID="db2735e6-11ae-45cc-94fe-9628468a810a" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" exitCode=139 Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.994773 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerDied","Data":"68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d"} Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.996714 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e1ba8eff-5520-4703-911a-992a05fc8bdc","Type":"ContainerDied","Data":"29e51125705b55d6c153d6179c026556a59748c93650281561316eba48e19e72"} Oct 03 09:12:15 crc kubenswrapper[4810]: I1003 09:12:15.996793 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.004710 4810 generic.go:334] "Generic (PLEG): container finished" podID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" exitCode=0 Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.004809 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef","Type":"ContainerDied","Data":"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787"} Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.004841 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"eb12a3c2-3c6e-437a-ac06-021b2c18c4ef","Type":"ContainerDied","Data":"afbb2bf3aba0242f71f063b43970d8004c452d0196ad73904a67696ed10f99b8"} Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.004965 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.014129 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b9848c8cf-kb9qj" event={"ID":"e4be45a7-4886-405d-b10b-0519258715f6","Type":"ContainerDied","Data":"9ea66bcfc2dff6a655069d90d0759d1578a75d48dfcdacd549e05651ed23cbe6"} Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.014257 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b9848c8cf-kb9qj" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.019283 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data" (OuterVolumeSpecName: "config-data") pod "1b8a0d09-74be-4ada-a91c-90a30042cc32" (UID: "1b8a0d09-74be-4ada-a91c-90a30042cc32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.047370 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.047557 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.047584 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.047969 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.048072 4810 scope.go:117] "RemoveContainer" containerID="62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.048656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522z5\" (UniqueName: \"kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.048706 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.049008 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.049027 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.049061 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.049932 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.049983 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052277 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052351 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") pod \"68d389d4-c420-497b-81aa-3277eaeb1a13\" (UID: \"68d389d4-c420-497b-81aa-3277eaeb1a13\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052909 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052928 4810 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052937 4810 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052945 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052953 4810 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052961 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68d389d4-c420-497b-81aa-3277eaeb1a13-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.052969 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b8a0d09-74be-4ada-a91c-90a30042cc32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.055094 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5" (OuterVolumeSpecName: "kube-api-access-522z5") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "kube-api-access-522z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.059693 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.060901 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets" (OuterVolumeSpecName: "secrets") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.066544 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.066738 4810 generic.go:334] "Generic (PLEG): container finished" podID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerID="dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52" exitCode=0 Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.066845 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.066862 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerDied","Data":"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52"} Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.066921 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1b8a0d09-74be-4ada-a91c-90a30042cc32","Type":"ContainerDied","Data":"e1b5017222ecf21b85d6a7a9f987e3981f732f94e571015860f95d8d17f2d5c3"} Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.087563 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.091816 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6" (OuterVolumeSpecName: "mysql-db") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.126878 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "68d389d4-c420-497b-81aa-3277eaeb1a13" (UID: "68d389d4-c420-497b-81aa-3277eaeb1a13"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.154998 4810 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.155032 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.155056 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") on node \"crc\" " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.155066 4810 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68d389d4-c420-497b-81aa-3277eaeb1a13-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.155077 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522z5\" (UniqueName: \"kubernetes.io/projected/68d389d4-c420-497b-81aa-3277eaeb1a13-kube-api-access-522z5\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.182954 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.183100 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6") on node "crc" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.202853 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.202987 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.110:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.241001 4810 scope.go:117] "RemoveContainer" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.241593 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8\": container with ID starting with 69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8 not found: ID does not exist" containerID="69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.241655 4810 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8"} err="failed to get container status \"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8\": rpc error: code = NotFound desc = could not find container \"69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8\": container with ID starting with 69d3bb512a10e52272a60f7bd976fbc5b78d631d571bec03f07f1f8e3ee4a6c8 not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.241683 4810 scope.go:117] "RemoveContainer" containerID="62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.242089 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece\": container with ID starting with 62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece not found: ID does not exist" containerID="62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.242118 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece"} err="failed to get container status \"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece\": rpc error: code = NotFound desc = could not find container \"62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece\": container with ID starting with 62099f01a66f1b446f3be1a58c0d6278e788fc882ebfc60fdc128d87abbacece not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.242162 4810 scope.go:117] "RemoveContainer" containerID="6c77c67453423e12da29e5a26e4c78b3f22564d268f5f9d6693c48a4c1fe3834" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.256647 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56bb816b-bb7a-4f09-a75a-b82160a1bcf6\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.271031 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_db2735e6-11ae-45cc-94fe-9628468a810a/ovn-northd/0.log" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.271128 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.283673 4810 scope.go:117] "RemoveContainer" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.296799 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.318013 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.334577 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.343336 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.343939 4810 scope.go:117] "RemoveContainer" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.345361 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787\": container with ID starting with d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787 not found: ID does not exist" containerID="d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.345400 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787"} err="failed to get container status \"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787\": rpc error: code = NotFound desc = could not find container \"d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787\": container with ID starting with d3b54b8d60a9464c44c3e8ff03e5ba0300aa251818f8afdbad7c4502c8d3c787 not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.345429 4810 scope.go:117] "RemoveContainer" containerID="3db6c0f0fe1196ece0d1fd23d0a431452da6842720d8625cef9b225452286fdb" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.349346 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.359511 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgn2w\" (UniqueName: \"kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.359677 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.359709 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.359733 4810 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.360739 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config" (OuterVolumeSpecName: "config") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.361321 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.361851 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts" (OuterVolumeSpecName: "scripts") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.365149 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b9848c8cf-kb9qj"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.375098 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w" (OuterVolumeSpecName: "kube-api-access-rgn2w") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "kube-api-access-rgn2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.381977 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.382132 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.382163 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs\") pod \"db2735e6-11ae-45cc-94fe-9628468a810a\" (UID: \"db2735e6-11ae-45cc-94fe-9628468a810a\") " Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.382860 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.383080 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.383108 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/db2735e6-11ae-45cc-94fe-9628468a810a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.383123 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgn2w\" (UniqueName: \"kubernetes.io/projected/db2735e6-11ae-45cc-94fe-9628468a810a-kube-api-access-rgn2w\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.385034 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.388559 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.395210 4810 scope.go:117] "RemoveContainer" containerID="dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.402878 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.463236 4810 scope.go:117] "RemoveContainer" containerID="8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.466758 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.482167 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "db2735e6-11ae-45cc-94fe-9628468a810a" (UID: "db2735e6-11ae-45cc-94fe-9628468a810a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.484918 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.484955 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.484969 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/db2735e6-11ae-45cc-94fe-9628468a810a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.491694 4810 scope.go:117] "RemoveContainer" containerID="dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.515483 4810 scope.go:117] "RemoveContainer" containerID="2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.549580 4810 scope.go:117] "RemoveContainer" containerID="dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.550665 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62\": container with ID starting with dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62 not found: ID does not exist" containerID="dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.550703 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62"} err="failed to get container status \"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62\": rpc error: code = NotFound desc = could not find container \"dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62\": container with ID starting with dcc0d86131e3b571bc43e1090050d82e614abddac31f7b94dfe850e4b76f6d62 not 
found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.550728 4810 scope.go:117] "RemoveContainer" containerID="8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.551668 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318\": container with ID starting with 8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318 not found: ID does not exist" containerID="8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.551695 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318"} err="failed to get container status \"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318\": rpc error: code = NotFound desc = could not find container \"8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318\": container with ID starting with 8e058987ee514770d3d2491a8ddc6b0743a88a1552f2ccb382ea6eb5d22fd318 not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.551710 4810 scope.go:117] "RemoveContainer" containerID="dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.552888 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52\": container with ID starting with dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52 not found: ID does not exist" containerID="dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.552976 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52"} err="failed to get container status \"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52\": rpc error: code = NotFound desc = could not find container \"dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52\": container with ID starting with dd3a4cfea3f88dc725e63c3f73a0cdb86a1f284c2cb8a61c4c40db972e5d4c52 not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.553020 4810 scope.go:117] "RemoveContainer" containerID="2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8" Oct 03 09:12:16 crc kubenswrapper[4810]: E1003 09:12:16.553569 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8\": container with ID starting with 2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8 not found: ID does not exist" containerID="2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.553615 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8"} err="failed to get container status \"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8\": rpc error: code = NotFound desc = could not find container 
\"2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8\": container with ID starting with 2e5ac2877c7f59077ea62dad916e4578b7c1d368937ab75ad477381efd3f92d8 not found: ID does not exist" Oct 03 09:12:16 crc kubenswrapper[4810]: I1003 09:12:16.720099 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.1.105:6080/vnc_lite.html\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.092798 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_db2735e6-11ae-45cc-94fe-9628468a810a/ovn-northd/0.log" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.092909 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"db2735e6-11ae-45cc-94fe-9628468a810a","Type":"ContainerDied","Data":"845ec6c9618d658b236e8d9404df56719ec237542322fc2b2c727c24ae48340d"} Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.092958 4810 scope.go:117] "RemoveContainer" containerID="1de9cdf0c83a3fe94c13ec3cd20f602220b80934411f5a0d8ab59ac3fce98075" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.092983 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.116191 4810 scope.go:117] "RemoveContainer" containerID="68877aeaeea596723136e156f4d03766eedc46611ea0054f93ee00ff18a29c4d" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.143229 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.157598 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.314197 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" path="/var/lib/kubelet/pods/1b8a0d09-74be-4ada-a91c-90a30042cc32/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.315025 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" path="/var/lib/kubelet/pods/68d389d4-c420-497b-81aa-3277eaeb1a13/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.316103 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" path="/var/lib/kubelet/pods/db2735e6-11ae-45cc-94fe-9628468a810a/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.316652 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" path="/var/lib/kubelet/pods/e1ba8eff-5520-4703-911a-992a05fc8bdc/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.317209 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4be45a7-4886-405d-b10b-0519258715f6" path="/var/lib/kubelet/pods/e4be45a7-4886-405d-b10b-0519258715f6/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.318204 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" path="/var/lib/kubelet/pods/eb12a3c2-3c6e-437a-ac06-021b2c18c4ef/volumes" Oct 03 09:12:17 crc kubenswrapper[4810]: I1003 09:12:17.459963 4810 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.149:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:18 crc kubenswrapper[4810]: I1003 09:12:18.082683 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 09:12:18 crc kubenswrapper[4810]: I1003 09:12:18.082901 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="b1164f6f-490c-4ade-9824-9157b3b46be2" containerName="adoption" containerID="cri-o://f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68" gracePeriod=30 Oct 03 09:12:18 crc kubenswrapper[4810]: I1003 09:12:18.350136 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 09:12:18 crc kubenswrapper[4810]: I1003 09:12:18.350374 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" containerName="adoption" containerID="cri-o://44a3da3283de87f0f425a172444a706cfee46d2186e7801031398a2bcfb7e8dc" gracePeriod=30 Oct 03 09:12:19 crc kubenswrapper[4810]: I1003 09:12:19.029117 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:19 crc kubenswrapper[4810]: E1003 09:12:19.029305 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:19 crc kubenswrapper[4810]: E1003 09:12:19.029403 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:35.029378337 +0000 UTC m=+8188.456629092 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:19 crc kubenswrapper[4810]: I1003 09:12:19.129710 4810 generic.go:334] "Generic (PLEG): container finished" podID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerID="3ef0c263e1c8f4ced443d655ece4132e18690001db86325aea223aa3d58b358e" exitCode=0 Oct 03 09:12:19 crc kubenswrapper[4810]: I1003 09:12:19.129774 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerDied","Data":"3ef0c263e1c8f4ced443d655ece4132e18690001db86325aea223aa3d58b358e"} Oct 03 09:12:19 crc kubenswrapper[4810]: I1003 09:12:19.130872 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:19 crc kubenswrapper[4810]: E1003 09:12:19.136713 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:19 crc kubenswrapper[4810]: E1003 09:12:19.136792 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:12:35.136768373 +0000 UTC m=+8188.564019118 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:20 crc kubenswrapper[4810]: I1003 09:12:20.713683 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Oct 03 09:12:21 crc kubenswrapper[4810]: I1003 09:12:21.150165 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerStarted","Data":"4bc03279b3229cbb4076beb0e89512d204882c5eb931d0c3923452f7172630da"} Oct 03 09:12:21 crc kubenswrapper[4810]: I1003 09:12:21.176677 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8p5tt" podStartSLOduration=7.087548164 podStartE2EDuration="15.176660737s" podCreationTimestamp="2025-10-03 09:12:06 +0000 UTC" firstStartedPulling="2025-10-03 09:12:12.211011291 +0000 UTC m=+8165.638262026" lastFinishedPulling="2025-10-03 09:12:20.300123864 +0000 UTC m=+8173.727374599" observedRunningTime="2025-10-03 09:12:21.170580725 +0000 UTC m=+8174.597831480" watchObservedRunningTime="2025-10-03 09:12:21.176660737 +0000 UTC m=+8174.603911472" Oct 03 09:12:21 crc kubenswrapper[4810]: I1003 09:12:21.264531 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-d6cf467cf-6jvjp" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.1.132:8004/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:21 crc kubenswrapper[4810]: I1003 09:12:21.302154 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-577984bbd4-45rx5" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.1.133:8000/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:12:24 crc kubenswrapper[4810]: I1003 09:12:24.244193 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-b9499669-jd95s" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.56:9696/\": dial tcp 10.217.1.56:9696: connect: connection refused" Oct 03 09:12:24 crc kubenswrapper[4810]: E1003 09:12:24.628231 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:24 crc kubenswrapper[4810]: E1003 09:12:24.629756 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:24 crc kubenswrapper[4810]: E1003 09:12:24.631423 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:24 crc kubenswrapper[4810]: E1003 09:12:24.631473 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:12:24 crc kubenswrapper[4810]: I1003 09:12:24.853524 4810 scope.go:117] "RemoveContainer" containerID="f0b610453374d95fa8d8b8835a740b4722435108e83542f0e16bb20b2adca98f" Oct 03 09:12:24 crc kubenswrapper[4810]: I1003 09:12:24.881093 4810 scope.go:117] "RemoveContainer" containerID="e14f117901d3ecf707af80f5154071e005064daf8d9971cd1b6b48315e536ea9" Oct 03 09:12:24 crc kubenswrapper[4810]: I1003 09:12:24.910468 4810 scope.go:117] "RemoveContainer" containerID="4c55292548a4d03c22af14ea11a94e15075846c2b8747762ba1fb825a828816a" Oct 03 09:12:24 crc kubenswrapper[4810]: I1003 09:12:24.942227 4810 scope.go:117] "RemoveContainer" containerID="6fe224bc71d716f02d19e02a0d34c59755d48f97a13da3c8af44eb5869dc1b3f" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.005124 4810 scope.go:117] "RemoveContainer" containerID="69c78926137ade89dbe6277076f566f6eb2cbdab8b9d13d7f01e5c19bd597377" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.026034 4810 scope.go:117] "RemoveContainer" containerID="0438a7801eb086bb2f2988fbf6a2faaadb40a8b29e9157eea6ecfa708ea3864e" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.145679 4810 scope.go:117] "RemoveContainer" containerID="c83c3fd2be32d74daf1dccbe626b87b0104daacbbfa8a4c68ca2d115aefcc9c6" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.317926 4810 scope.go:117] "RemoveContainer" containerID="211daa9ed516000f1909977a5746e579501a880585f3b68881c446794b0cd9ed" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.348980 4810 scope.go:117] "RemoveContainer" containerID="be1d419bdcaddfa86e01ac0347a626d35490d6b41d2b7976a512edfe5809af3c" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.372657 4810 scope.go:117] "RemoveContainer" containerID="ccaddef8fdc23da0618108ad3920e7dc36166714d51af18712d008fbb1ee8cf6" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.395521 4810 scope.go:117] "RemoveContainer" containerID="e5de03c1850b861978cbefa9ed2154f1aa86748c6ffcdac9186479f812fe22bd" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.420632 4810 scope.go:117] "RemoveContainer" containerID="ba6cb84bebecb7683aa43366962d518dd88cdaed82bb6fafec8f5209c8020422" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.446627 4810 scope.go:117] "RemoveContainer" containerID="ff9e2e550b1bf5f4b6636a6d608048c84dd243ea8639895d89b0add8d1330eac" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.493370 4810 scope.go:117] "RemoveContainer" containerID="f8e127303bc8a535b97b0d6e9b163d2ee3db2eaaf4c2ccfc515413c4ceb5fcbc" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.536386 4810 scope.go:117] "RemoveContainer" 
containerID="ba8f23b00e574ed38b94b464a2550358de6f47b13a732b76199ab43ac8a36415" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.588402 4810 scope.go:117] "RemoveContainer" containerID="5b4764a5166f67c1b430dfc61b76795284f3ddc58b5b1b1ce0e10d2c554b70f0" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.618838 4810 scope.go:117] "RemoveContainer" containerID="3ea1376bf78156d4617497ed5e84f891a123673852c2c754a207468608cc80b4" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.678578 4810 scope.go:117] "RemoveContainer" containerID="cbb2088da894702fc6a881a5dab4338b4a03c2b8469ca40cdbf5ae5f3ce22b40" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.801741 4810 scope.go:117] "RemoveContainer" containerID="aafd95e2c2156c5d62c9e52e62267e00c8cc9c2c5da26c2f67f4a5b4754d0e28" Oct 03 09:12:25 crc kubenswrapper[4810]: I1003 09:12:25.840398 4810 scope.go:117] "RemoveContainer" containerID="67b54925633fb7897de4f9db6a5d84178f3f191ee9b36f058f8d0c7e2ef38c22" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.005677 4810 scope.go:117] "RemoveContainer" containerID="dbecd6cb4776c31357e1271ab6ac3c843e982451b993b84ecf53c0a4ebaabc01" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.036871 4810 scope.go:117] "RemoveContainer" containerID="a9eb5c4e834ec18df22b9dc93a6f7d4c539136ebfda5ef3d4283bbbf3ada1234" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.074396 4810 scope.go:117] "RemoveContainer" containerID="a2c2611ee5c4b7a3eb513f4e9f6d3e9dfa093b9ff621e7dcd667a7d9a57b1f3e" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.110746 4810 scope.go:117] "RemoveContainer" containerID="d5561a2dc144a26ef50427c1350893c3889c01df11510ef51f59866fb9c09a65" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.149401 4810 scope.go:117] "RemoveContainer" containerID="3131e01fff21547ea79afbfa86fce6c8d593859c7204fdc10be84801049c36f2" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.209635 4810 scope.go:117] "RemoveContainer" containerID="efbbea635fd7c2512387b085e21c476168a7ee9b5fafe68ca05dc78b78d49a5c" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.289861 4810 scope.go:117] "RemoveContainer" containerID="e38e84bc6b6ee906a778b23d1848296550b447437e04a7e712e4ebf93b978f4a" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.339851 4810 scope.go:117] "RemoveContainer" containerID="d7650812d29d222e124e1900773dc3db579c0939ad0977b0cc2d623d76fc78cc" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.492458 4810 scope.go:117] "RemoveContainer" containerID="276347e95ea2f8ef1722075a72ac28686ed4623fefd6f74a753d70385f27ad2e" Oct 03 09:12:26 crc kubenswrapper[4810]: I1003 09:12:26.745401 4810 scope.go:117] "RemoveContainer" containerID="cdda3f94e72b2e0fc1363ad8dec85327b12a98239d858659e6c717c1fc77cf5f" Oct 03 09:12:27 crc kubenswrapper[4810]: I1003 09:12:27.122388 4810 scope.go:117] "RemoveContainer" containerID="5f638f1e284830d337b62e2a6e0e41447ef9f5c27392c52ae056d6b9e2684516" Oct 03 09:12:27 crc kubenswrapper[4810]: I1003 09:12:27.287452 4810 scope.go:117] "RemoveContainer" containerID="ff15e78219b535ad72c83f852eb4e194818599bd9524f809a3c98675ed9023ed" Oct 03 09:12:27 crc kubenswrapper[4810]: I1003 09:12:27.341117 4810 scope.go:117] "RemoveContainer" containerID="756e3fa73811a7d6d4b37ad57ce515d218bc088c49d197f258e0fc1126478923" Oct 03 09:12:28 crc kubenswrapper[4810]: I1003 09:12:28.358790 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:28 crc kubenswrapper[4810]: I1003 09:12:28.359649 
4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:28 crc kubenswrapper[4810]: I1003 09:12:28.360832 4810 generic.go:334] "Generic (PLEG): container finished" podID="48f62813-7b25-45b6-9942-a5e54b40a235" containerID="b96ac21f6b885aebfbc17e90eb2244817f36b2064b8dcd70ce7c7d24cb87aa0f" exitCode=0 Oct 03 09:12:28 crc kubenswrapper[4810]: I1003 09:12:28.360931 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerDied","Data":"b96ac21f6b885aebfbc17e90eb2244817f36b2064b8dcd70ce7c7d24cb87aa0f"} Oct 03 09:12:28 crc kubenswrapper[4810]: I1003 09:12:28.416934 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.145793 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b9499669-jd95s" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.195974 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196020 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196104 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196131 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqv7\" (UniqueName: \"kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196150 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196183 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.196263 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs\") pod \"48f62813-7b25-45b6-9942-a5e54b40a235\" (UID: \"48f62813-7b25-45b6-9942-a5e54b40a235\") " Oct 03 09:12:29 crc 
kubenswrapper[4810]: I1003 09:12:29.214933 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.216125 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7" (OuterVolumeSpecName: "kube-api-access-mfqv7") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "kube-api-access-mfqv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.247684 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.265460 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.267465 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.284157 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config" (OuterVolumeSpecName: "config") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298080 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "48f62813-7b25-45b6-9942-a5e54b40a235" (UID: "48f62813-7b25-45b6-9942-a5e54b40a235"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298268 4810 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298283 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298294 4810 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298303 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqv7\" (UniqueName: \"kubernetes.io/projected/48f62813-7b25-45b6-9942-a5e54b40a235-kube-api-access-mfqv7\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298314 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298322 4810 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.298330 4810 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48f62813-7b25-45b6-9942-a5e54b40a235-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.373698 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b9499669-jd95s" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.373739 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b9499669-jd95s" event={"ID":"48f62813-7b25-45b6-9942-a5e54b40a235","Type":"ContainerDied","Data":"410fa5f41d72d766d3ada43a743ac5d2d88f201f43172b3856f15637e8ce18cd"} Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.374147 4810 scope.go:117] "RemoveContainer" containerID="7f5bf483256e9e40c501144201c3dc96463a540abcc9d8379933be6e0f86f926" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.401703 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.408234 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b9499669-jd95s"] Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.423541 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.479440 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:29 crc kubenswrapper[4810]: I1003 09:12:29.496384 4810 scope.go:117] "RemoveContainer" containerID="b96ac21f6b885aebfbc17e90eb2244817f36b2064b8dcd70ce7c7d24cb87aa0f" Oct 03 09:12:30 crc kubenswrapper[4810]: I1003 09:12:30.714053 4810 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c58b6647b-ttkrv" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.120:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.120:8443: connect: connection refused" Oct 03 09:12:30 crc kubenswrapper[4810]: I1003 09:12:30.714173 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:12:31 crc kubenswrapper[4810]: I1003 09:12:31.314225 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" path="/var/lib/kubelet/pods/48f62813-7b25-45b6-9942-a5e54b40a235/volumes" Oct 03 09:12:31 crc kubenswrapper[4810]: I1003 09:12:31.395934 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8p5tt" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="registry-server" containerID="cri-o://4bc03279b3229cbb4076beb0e89512d204882c5eb931d0c3923452f7172630da" gracePeriod=2 Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.416704 4810 generic.go:334] "Generic (PLEG): container finished" podID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerID="4bc03279b3229cbb4076beb0e89512d204882c5eb931d0c3923452f7172630da" exitCode=0 Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.417190 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerDied","Data":"4bc03279b3229cbb4076beb0e89512d204882c5eb931d0c3923452f7172630da"} Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.417226 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p5tt" event={"ID":"73897ece-d31a-4b17-aae9-1b781bc5dc44","Type":"ContainerDied","Data":"17621b6427bae8344b55e256af9f0a6d165ee530a41dd7483c1ce325ca42d88d"} Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.417243 4810 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17621b6427bae8344b55e256af9f0a6d165ee530a41dd7483c1ce325ca42d88d" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.439117 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.559747 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content\") pod \"73897ece-d31a-4b17-aae9-1b781bc5dc44\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.559807 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj\") pod \"73897ece-d31a-4b17-aae9-1b781bc5dc44\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.559868 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities\") pod \"73897ece-d31a-4b17-aae9-1b781bc5dc44\" (UID: \"73897ece-d31a-4b17-aae9-1b781bc5dc44\") " Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.561148 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities" (OuterVolumeSpecName: "utilities") pod "73897ece-d31a-4b17-aae9-1b781bc5dc44" (UID: "73897ece-d31a-4b17-aae9-1b781bc5dc44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.566654 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj" (OuterVolumeSpecName: "kube-api-access-rq5pj") pod "73897ece-d31a-4b17-aae9-1b781bc5dc44" (UID: "73897ece-d31a-4b17-aae9-1b781bc5dc44"). InnerVolumeSpecName "kube-api-access-rq5pj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.661620 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq5pj\" (UniqueName: \"kubernetes.io/projected/73897ece-d31a-4b17-aae9-1b781bc5dc44-kube-api-access-rq5pj\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.661671 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.694660 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73897ece-d31a-4b17-aae9-1b781bc5dc44" (UID: "73897ece-d31a-4b17-aae9-1b781bc5dc44"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:32 crc kubenswrapper[4810]: I1003 09:12:32.763074 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73897ece-d31a-4b17-aae9-1b781bc5dc44-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:33 crc kubenswrapper[4810]: E1003 09:12:33.390819 4810 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73897ece_d31a_4b17_aae9_1b781bc5dc44.slice/crio-17621b6427bae8344b55e256af9f0a6d165ee530a41dd7483c1ce325ca42d88d\": RecentStats: unable to find data in memory cache]" Oct 03 09:12:33 crc kubenswrapper[4810]: I1003 09:12:33.448249 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p5tt" Oct 03 09:12:33 crc kubenswrapper[4810]: I1003 09:12:33.470457 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:33 crc kubenswrapper[4810]: I1003 09:12:33.475762 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8p5tt"] Oct 03 09:12:34 crc kubenswrapper[4810]: E1003 09:12:34.624465 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:34 crc kubenswrapper[4810]: E1003 09:12:34.626716 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:34 crc kubenswrapper[4810]: E1003 09:12:34.628837 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:34 crc kubenswrapper[4810]: E1003 09:12:34.628881 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.023294 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121321 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121378 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121413 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmzft\" (UniqueName: \"kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121506 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121581 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121617 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121656 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs\") pod \"d2aaa8b9-8b22-47de-9476-06746378f92f\" (UID: \"d2aaa8b9-8b22-47de-9476-06746378f92f\") " Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.121913 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.122137 4810 secret.go:188] Couldn't get secret openstack/heat-config-data: secret "heat-config-data" not found Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.122205 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:13:07.122186352 +0000 UTC m=+8220.549437087 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : secret "heat-config-data" not found Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.127020 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft" (OuterVolumeSpecName: "kube-api-access-hmzft") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "kube-api-access-hmzft". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.127141 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.127503 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs" (OuterVolumeSpecName: "logs") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.142548 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data" (OuterVolumeSpecName: "config-data") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.146791 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.153200 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts" (OuterVolumeSpecName: "scripts") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.171981 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "d2aaa8b9-8b22-47de-9476-06746378f92f" (UID: "d2aaa8b9-8b22-47de-9476-06746378f92f"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224023 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") pod \"heat-engine-76bc9bd849-gxqg6\" (UID: \"95eba70b-a04a-4ca8-8e43-0ee212328321\") " pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224120 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224133 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmzft\" (UniqueName: \"kubernetes.io/projected/d2aaa8b9-8b22-47de-9476-06746378f92f-kube-api-access-hmzft\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224144 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224153 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d2aaa8b9-8b22-47de-9476-06746378f92f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224201 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224212 4810 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2aaa8b9-8b22-47de-9476-06746378f92f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.224220 4810 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2aaa8b9-8b22-47de-9476-06746378f92f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.228091 4810 projected.go:194] Error preparing data for projected volume kube-api-access-fscfn for pod openstack/heat-engine-76bc9bd849-gxqg6: failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.228178 4810 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn podName:95eba70b-a04a-4ca8-8e43-0ee212328321 nodeName:}" failed. No retries permitted until 2025-10-03 09:13:07.228156452 +0000 UTC m=+8220.655407187 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fscfn" (UniqueName: "kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn") pod "heat-engine-76bc9bd849-gxqg6" (UID: "95eba70b-a04a-4ca8-8e43-0ee212328321") : failed to fetch token: serviceaccounts "heat-heat" not found Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.313466 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" path="/var/lib/kubelet/pods/73897ece-d31a-4b17-aae9-1b781bc5dc44/volumes" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.462555 4810 generic.go:334] "Generic (PLEG): container finished" podID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerID="14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409" exitCode=137 Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.462603 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c58b6647b-ttkrv" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.462604 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerDied","Data":"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409"} Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.462700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c58b6647b-ttkrv" event={"ID":"d2aaa8b9-8b22-47de-9476-06746378f92f","Type":"ContainerDied","Data":"15f1e51810efa2f0f3527e2a0cfc1ae889104a2578b799f81ee99c31f32ccf48"} Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.462725 4810 scope.go:117] "RemoveContainer" containerID="a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.494535 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.501459 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c58b6647b-ttkrv"] Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.611075 4810 scope.go:117] "RemoveContainer" containerID="14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.627108 4810 scope.go:117] "RemoveContainer" containerID="a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d" Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.628634 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d\": container with ID starting with a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d not found: ID does not exist" containerID="a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.628674 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d"} err="failed to get container status \"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d\": rpc error: code = NotFound desc = could not find container \"a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d\": container with ID starting with a0b78f9cc77753dda387997694e97aae308f687f89cc3377e72a5da7abfe665d not found: ID does not exist" Oct 03 09:12:35 crc 
kubenswrapper[4810]: I1003 09:12:35.628703 4810 scope.go:117] "RemoveContainer" containerID="14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409" Oct 03 09:12:35 crc kubenswrapper[4810]: E1003 09:12:35.629043 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409\": container with ID starting with 14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409 not found: ID does not exist" containerID="14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409" Oct 03 09:12:35 crc kubenswrapper[4810]: I1003 09:12:35.629104 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409"} err="failed to get container status \"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409\": rpc error: code = NotFound desc = could not find container \"14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409\": container with ID starting with 14b2d7edc689db281ec3c3400f6d7a82a9d6e4878500ba29e71a750d6b9d7409 not found: ID does not exist" Oct 03 09:12:37 crc kubenswrapper[4810]: I1003 09:12:37.317685 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" path="/var/lib/kubelet/pods/d2aaa8b9-8b22-47de-9476-06746378f92f/volumes" Oct 03 09:12:37 crc kubenswrapper[4810]: I1003 09:12:37.859559 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc783295d-2aa9-46e1-baad-eef94c60dc6f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc783295d-2aa9-46e1-baad-eef94c60dc6f] : Timed out while waiting for systemd to remove kubepods-besteffort-podc783295d_2aa9_46e1_baad_eef94c60dc6f.slice" Oct 03 09:12:37 crc kubenswrapper[4810]: E1003 09:12:37.859613 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podc783295d-2aa9-46e1-baad-eef94c60dc6f] : unable to destroy cgroup paths for cgroup [kubepods besteffort podc783295d-2aa9-46e1-baad-eef94c60dc6f] : Timed out while waiting for systemd to remove kubepods-besteffort-podc783295d_2aa9_46e1_baad_eef94c60dc6f.slice" pod="openstack/openstackclient" podUID="c783295d-2aa9-46e1-baad-eef94c60dc6f" Oct 03 09:12:38 crc kubenswrapper[4810]: I1003 09:12:38.487181 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 09:12:41 crc kubenswrapper[4810]: I1003 09:12:41.715212 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod95eba70b-a04a-4ca8-8e43-0ee212328321"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod95eba70b-a04a-4ca8-8e43-0ee212328321] : Timed out while waiting for systemd to remove kubepods-besteffort-pod95eba70b_a04a_4ca8_8e43_0ee212328321.slice" Oct 03 09:12:41 crc kubenswrapper[4810]: E1003 09:12:41.715581 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod95eba70b-a04a-4ca8-8e43-0ee212328321] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod95eba70b-a04a-4ca8-8e43-0ee212328321] : Timed out while waiting for systemd to remove kubepods-besteffort-pod95eba70b_a04a_4ca8_8e43_0ee212328321.slice" pod="openstack/heat-engine-76bc9bd849-gxqg6" podUID="95eba70b-a04a-4ca8-8e43-0ee212328321" Oct 03 09:12:41 crc kubenswrapper[4810]: I1003 09:12:41.730962 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod64afa73e-dc4b-4a80-9597-0c552e43f979"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod64afa73e-dc4b-4a80-9597-0c552e43f979] : Timed out while waiting for systemd to remove kubepods-besteffort-pod64afa73e_dc4b_4a80_9597_0c552e43f979.slice" Oct 03 09:12:41 crc kubenswrapper[4810]: E1003 09:12:41.731009 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod64afa73e-dc4b-4a80-9597-0c552e43f979] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod64afa73e-dc4b-4a80-9597-0c552e43f979] : Timed out while waiting for systemd to remove kubepods-besteffort-pod64afa73e_dc4b_4a80_9597_0c552e43f979.slice" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" Oct 03 09:12:41 crc kubenswrapper[4810]: I1003 09:12:41.734239 4810 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7d4234c2-ea29-4c53-aea2-05cbaee65464"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d4234c2-ea29-4c53-aea2-05cbaee65464] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d4234c2_ea29_4c53_aea2_05cbaee65464.slice" Oct 03 09:12:41 crc kubenswrapper[4810]: E1003 09:12:41.734330 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7d4234c2-ea29-4c53-aea2-05cbaee65464] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7d4234c2-ea29-4c53-aea2-05cbaee65464] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7d4234c2_ea29_4c53_aea2_05cbaee65464.slice" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.517562 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-76bc9bd849-gxqg6" Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.518678 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5cf66c6fc5-c9szq" Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.519085 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-d546c7c6d-hsgpr" Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.586913 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.593226 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-d546c7c6d-hsgpr"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.609084 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76bc9bd849-gxqg6"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.618307 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-76bc9bd849-gxqg6"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.635519 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.640865 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5cf66c6fc5-c9szq"] Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.745876 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fscfn\" (UniqueName: \"kubernetes.io/projected/95eba70b-a04a-4ca8-8e43-0ee212328321-kube-api-access-fscfn\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:42 crc kubenswrapper[4810]: I1003 09:12:42.745956 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95eba70b-a04a-4ca8-8e43-0ee212328321-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:43 crc kubenswrapper[4810]: I1003 09:12:43.319360 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64afa73e-dc4b-4a80-9597-0c552e43f979" path="/var/lib/kubelet/pods/64afa73e-dc4b-4a80-9597-0c552e43f979/volumes" Oct 03 09:12:43 crc kubenswrapper[4810]: I1003 09:12:43.320481 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4234c2-ea29-4c53-aea2-05cbaee65464" path="/var/lib/kubelet/pods/7d4234c2-ea29-4c53-aea2-05cbaee65464/volumes" Oct 03 09:12:43 crc kubenswrapper[4810]: I1003 09:12:43.321510 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eba70b-a04a-4ca8-8e43-0ee212328321" path="/var/lib/kubelet/pods/95eba70b-a04a-4ca8-8e43-0ee212328321/volumes" Oct 03 09:12:44 crc kubenswrapper[4810]: E1003 09:12:44.630039 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:44 crc kubenswrapper[4810]: E1003 09:12:44.632279 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:44 crc kubenswrapper[4810]: E1003 09:12:44.634401 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] 
Oct 03 09:12:44 crc kubenswrapper[4810]: E1003 09:12:44.634439 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.568122 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.593313 4810 generic.go:334] "Generic (PLEG): container finished" podID="b1164f6f-490c-4ade-9824-9157b3b46be2" containerID="f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68" exitCode=137 Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.593373 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b1164f6f-490c-4ade-9824-9157b3b46be2","Type":"ContainerDied","Data":"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68"} Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.593402 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"b1164f6f-490c-4ade-9824-9157b3b46be2","Type":"ContainerDied","Data":"be550b9064fbc2e2ee2ab7c58ef633e383e9ad05e4dd05f9f745483b5125e271"} Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.593419 4810 scope.go:117] "RemoveContainer" containerID="f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.593526 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.605392 4810 generic.go:334] "Generic (PLEG): container finished" podID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" containerID="44a3da3283de87f0f425a172444a706cfee46d2186e7801031398a2bcfb7e8dc" exitCode=137 Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.605439 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"c2ad2114-1f17-4dee-ba01-428f6f8a33d8","Type":"ContainerDied","Data":"44a3da3283de87f0f425a172444a706cfee46d2186e7801031398a2bcfb7e8dc"} Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.645209 4810 scope.go:117] "RemoveContainer" containerID="f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68" Oct 03 09:12:48 crc kubenswrapper[4810]: E1003 09:12:48.645747 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68\": container with ID starting with f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68 not found: ID does not exist" containerID="f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.645771 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68"} err="failed to get container status \"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68\": rpc error: code = NotFound desc = could not find container \"f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68\": container with ID starting with f06a5fe4bcdd2bb5213f2ab48661179fbfc2ad8c0661cd573eb25af84e28cf68 not found: ID 
does not exist" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.667494 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") pod \"b1164f6f-490c-4ade-9824-9157b3b46be2\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.667711 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb4qn\" (UniqueName: \"kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn\") pod \"b1164f6f-490c-4ade-9824-9157b3b46be2\" (UID: \"b1164f6f-490c-4ade-9824-9157b3b46be2\") " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.678142 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn" (OuterVolumeSpecName: "kube-api-access-fb4qn") pod "b1164f6f-490c-4ade-9824-9157b3b46be2" (UID: "b1164f6f-490c-4ade-9824-9157b3b46be2"). InnerVolumeSpecName "kube-api-access-fb4qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.678295 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb" (OuterVolumeSpecName: "mariadb-data") pod "b1164f6f-490c-4ade-9824-9157b3b46be2" (UID: "b1164f6f-490c-4ade-9824-9157b3b46be2"). InnerVolumeSpecName "pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.769791 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") on node \"crc\" " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.769862 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb4qn\" (UniqueName: \"kubernetes.io/projected/b1164f6f-490c-4ade-9824-9157b3b46be2-kube-api-access-fb4qn\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.801780 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.818025 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.818199 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb") on node "crc" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.871462 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert\") pod \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.871699 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4xjv\" (UniqueName: \"kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv\") pod \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.872402 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") pod \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\" (UID: \"c2ad2114-1f17-4dee-ba01-428f6f8a33d8\") " Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.872882 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-27ed2a25-2ce4-4409-86b6-75fa51b337eb\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.874043 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv" (OuterVolumeSpecName: "kube-api-access-l4xjv") pod "c2ad2114-1f17-4dee-ba01-428f6f8a33d8" (UID: "c2ad2114-1f17-4dee-ba01-428f6f8a33d8"). InnerVolumeSpecName "kube-api-access-l4xjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.876406 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "c2ad2114-1f17-4dee-ba01-428f6f8a33d8" (UID: "c2ad2114-1f17-4dee-ba01-428f6f8a33d8"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.880064 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793" (OuterVolumeSpecName: "ovn-data") pod "c2ad2114-1f17-4dee-ba01-428f6f8a33d8" (UID: "c2ad2114-1f17-4dee-ba01-428f6f8a33d8"). InnerVolumeSpecName "pvc-840b500d-4959-4ff5-87e9-220ef56ba793". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.939878 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.959523 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.974566 4810 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.974604 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4xjv\" (UniqueName: \"kubernetes.io/projected/c2ad2114-1f17-4dee-ba01-428f6f8a33d8-kube-api-access-l4xjv\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:48 crc kubenswrapper[4810]: I1003 09:12:48.974645 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") on node \"crc\" " Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.073956 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.074330 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-840b500d-4959-4ff5-87e9-220ef56ba793" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793") on node "crc" Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.075867 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-840b500d-4959-4ff5-87e9-220ef56ba793\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-840b500d-4959-4ff5-87e9-220ef56ba793\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.324776 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1164f6f-490c-4ade-9824-9157b3b46be2" path="/var/lib/kubelet/pods/b1164f6f-490c-4ade-9824-9157b3b46be2/volumes" Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.618467 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"c2ad2114-1f17-4dee-ba01-428f6f8a33d8","Type":"ContainerDied","Data":"3a070ab9963c69ee57b08d9c2a160316baf19602b47c5022bf3be179f1c0db8c"} Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.619169 4810 scope.go:117] "RemoveContainer" containerID="44a3da3283de87f0f425a172444a706cfee46d2186e7801031398a2bcfb7e8dc" Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.619439 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.644802 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 09:12:49 crc kubenswrapper[4810]: I1003 09:12:49.651171 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Oct 03 09:12:51 crc kubenswrapper[4810]: I1003 09:12:51.313206 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" path="/var/lib/kubelet/pods/c2ad2114-1f17-4dee-ba01-428f6f8a33d8/volumes" Oct 03 09:12:54 crc kubenswrapper[4810]: E1003 09:12:54.623339 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:54 crc kubenswrapper[4810]: E1003 09:12:54.625955 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:54 crc kubenswrapper[4810]: E1003 09:12:54.629178 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:12:54 crc kubenswrapper[4810]: E1003 09:12:54.629310 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:13:02 crc kubenswrapper[4810]: I1003 09:13:02.088376 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:02 crc kubenswrapper[4810]: I1003 09:13:02.089051 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:04 crc kubenswrapper[4810]: E1003 09:13:04.624217 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:13:04 crc kubenswrapper[4810]: E1003 09:13:04.627075 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:13:04 crc kubenswrapper[4810]: E1003 09:13:04.628810 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 03 09:13:04 crc kubenswrapper[4810]: E1003 09:13:04.628945 4810 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-55547566f9-2tzph" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:13:06 crc kubenswrapper[4810]: I1003 09:13:06.759537 4810 generic.go:334] "Generic (PLEG): container finished" podID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" exitCode=137 Oct 03 09:13:06 crc kubenswrapper[4810]: I1003 09:13:06.759644 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55547566f9-2tzph" event={"ID":"4474daf6-43cd-4602-895c-c4ca53b7b35d","Type":"ContainerDied","Data":"f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c"} Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.198584 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.268196 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc6wr\" (UniqueName: \"kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr\") pod \"4474daf6-43cd-4602-895c-c4ca53b7b35d\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.268568 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle\") pod \"4474daf6-43cd-4602-895c-c4ca53b7b35d\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.268663 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data\") pod \"4474daf6-43cd-4602-895c-c4ca53b7b35d\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.268707 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom\") pod \"4474daf6-43cd-4602-895c-c4ca53b7b35d\" (UID: \"4474daf6-43cd-4602-895c-c4ca53b7b35d\") " Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.273730 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4474daf6-43cd-4602-895c-c4ca53b7b35d" (UID: "4474daf6-43cd-4602-895c-c4ca53b7b35d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.275128 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr" (OuterVolumeSpecName: "kube-api-access-mc6wr") pod "4474daf6-43cd-4602-895c-c4ca53b7b35d" (UID: "4474daf6-43cd-4602-895c-c4ca53b7b35d"). InnerVolumeSpecName "kube-api-access-mc6wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.296272 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4474daf6-43cd-4602-895c-c4ca53b7b35d" (UID: "4474daf6-43cd-4602-895c-c4ca53b7b35d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.323755 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data" (OuterVolumeSpecName: "config-data") pod "4474daf6-43cd-4602-895c-c4ca53b7b35d" (UID: "4474daf6-43cd-4602-895c-c4ca53b7b35d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.370376 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc6wr\" (UniqueName: \"kubernetes.io/projected/4474daf6-43cd-4602-895c-c4ca53b7b35d-kube-api-access-mc6wr\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.370417 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.370426 4810 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.370437 4810 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4474daf6-43cd-4602-895c-c4ca53b7b35d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.768882 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-55547566f9-2tzph" event={"ID":"4474daf6-43cd-4602-895c-c4ca53b7b35d","Type":"ContainerDied","Data":"19e6acf552cc80d7718ee7888a9c389b0b7b04ed92e8baf33f267c22be53d095"} Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.768972 4810 scope.go:117] "RemoveContainer" containerID="f4d24a29f0dbedf6e5dd4d438a75dc5ccc37ab32dbb67c56f6d7205cc296aa0c" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.769094 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-55547566f9-2tzph" Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.787789 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:13:07 crc kubenswrapper[4810]: I1003 09:13:07.793356 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-55547566f9-2tzph"] Oct 03 09:13:09 crc kubenswrapper[4810]: I1003 09:13:09.312358 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" path="/var/lib/kubelet/pods/4474daf6-43cd-4602-895c-c4ca53b7b35d/volumes" Oct 03 09:13:29 crc kubenswrapper[4810]: I1003 09:13:29.601474 4810 scope.go:117] "RemoveContainer" containerID="15874275b3a2597d983a16f3a24a3d8a1f72f8482b5ea12528bb53e8dbd99e80" Oct 03 09:13:32 crc kubenswrapper[4810]: I1003 09:13:32.088464 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:32 crc kubenswrapper[4810]: I1003 09:13:32.089424 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.088775 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.089395 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.089447 4810 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.090259 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.090328 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0" gracePeriod=600 Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.227050 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" 
containerID="77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0" exitCode=0 Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.227136 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0"} Oct 03 09:14:02 crc kubenswrapper[4810]: I1003 09:14:02.227584 4810 scope.go:117] "RemoveContainer" containerID="3e28437355f0881d8a1807514d2481883e89a0021912f452e5acc6ffb76f89da" Oct 03 09:14:03 crc kubenswrapper[4810]: I1003 09:14:03.239493 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerStarted","Data":"53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6"} Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.050379 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hfgcf/must-gather-6kcxw"] Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051354 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerName="nova-scheduler-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051369 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerName="nova-scheduler-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051384 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051393 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051403 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" containerName="kube-state-metrics" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051409 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" containerName="kube-state-metrics" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051426 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051432 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051444 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="mysql-bootstrap" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051450 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="mysql-bootstrap" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051463 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ff0594-a3c0-4939-8113-d4611d0a53ce" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051469 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ff0594-a3c0-4939-8113-d4611d0a53ce" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: 
E1003 09:14:10.051477 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051483 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051493 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f98119-f52a-4cea-a154-7380157f3720" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051500 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f98119-f52a-4cea-a154-7380157f3720" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051512 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="extract-utilities" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051521 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="extract-utilities" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051536 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051543 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051552 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051560 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051573 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051581 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051596 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051603 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051613 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051620 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051636 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="setup-container" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051644 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="setup-container" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 
09:14:10.051652 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051659 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051673 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="probe" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051682 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="probe" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051698 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e6d09e-815b-4c16-ae3e-79550f72b352" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051705 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e6d09e-815b-4c16-ae3e-79550f72b352" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051714 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051722 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051736 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051743 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051752 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051761 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051773 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerName="nova-cell0-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051781 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerName="nova-cell0-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051790 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be45a7-4886-405d-b10b-0519258715f6" containerName="keystone-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051797 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be45a7-4886-405d-b10b-0519258715f6" containerName="keystone-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051809 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-listener" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051818 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-listener" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 
09:14:10.051832 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051840 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051857 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051864 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051879 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051886 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-metadata" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051919 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051926 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051936 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="proxy-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051943 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="proxy-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051953 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1164f6f-490c-4ade-9824-9157b3b46be2" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051961 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1164f6f-490c-4ade-9824-9157b3b46be2" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051970 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051977 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.051988 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.051996 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052009 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052016 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: 
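The run of cpu_manager.go and state_mem.go entries here is the kubelet's CPU Manager dropping per-container CPU-set assignments for containers that no longer exist on the node; the matching on-disk record is the CPU Manager checkpoint, conventionally /var/lib/kubelet/cpu_manager_state. That path and the fields mentioned in the final comment are assumptions about the checkpoint, not something shown in this log; the sketch below simply dumps the file as generic JSON so the remaining assignments can be compared with the "Deleted CPUSet assignment" lines.

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	// Conventional kubelet CPU Manager checkpoint location (an assumption; adjust as needed).
	data, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		panic(err)
	}
	// Decode generically instead of assuming an exact schema.
	var state map[string]interface{}
	if err := json.Unmarshal(data, &state); err != nil {
		panic(err)
	}
	out, _ := json.MarshalIndent(state, "", "  ")
	fmt.Println(string(out)) // typically shows the policy, the shared CPU set and per-pod entries
}
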
E1003 09:14:10.052034 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052041 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052051 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="extract-content" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052059 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="extract-content" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052069 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03d4344e-bfcb-42d1-8bcb-6924d24ae77b" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052077 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d4344e-bfcb-42d1-8bcb-6924d24ae77b" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052087 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56de889-4796-4d3c-87b0-faff35387c26" containerName="memcached" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052094 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56de889-4796-4d3c-87b0-faff35387c26" containerName="memcached" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052107 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052114 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052123 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e600d090-3ebc-4552-a5d6-0c3de54c5de3" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052130 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e600d090-3ebc-4552-a5d6-0c3de54c5de3" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052140 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052147 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052157 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699b1f34-ccf2-4905-886d-10afe56a758f" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052163 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="699b1f34-ccf2-4905-886d-10afe56a758f" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052173 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-notifier" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052180 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" 
containerName="aodh-notifier" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052190 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="mysql-bootstrap" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052197 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="mysql-bootstrap" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052208 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052216 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052224 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052232 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052243 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="registry-server" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052250 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="registry-server" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052262 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-evaluator" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052269 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-evaluator" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052283 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052290 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052299 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052306 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052314 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052322 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052331 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-central-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052338 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" 
containerName="ceilometer-central-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052352 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerName="heat-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052359 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerName="heat-api" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052373 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="setup-container" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052381 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="setup-container" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052391 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="cinder-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052399 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="cinder-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052413 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052420 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052433 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="sg-core" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052440 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="sg-core" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052450 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="openstack-network-exporter" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052458 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="openstack-network-exporter" Oct 03 09:14:10 crc kubenswrapper[4810]: E1003 09:14:10.052471 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-notification-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052479 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-notification-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052656 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f98119-f52a-4cea-a154-7380157f3720" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052670 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e6d09e-815b-4c16-ae3e-79550f72b352" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052687 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052699 4810 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052707 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="sg-core" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052720 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052730 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ba8eff-5520-4703-911a-992a05fc8bdc" containerName="nova-scheduler-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052745 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="03d4344e-bfcb-42d1-8bcb-6924d24ae77b" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052757 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052769 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" containerName="nova-metadata-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052779 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052788 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d389d4-c420-497b-81aa-3277eaeb1a13" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052798 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="cinder-scheduler" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052809 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052819 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-evaluator" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052831 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4be45a7-4886-405d-b10b-0519258715f6" containerName="keystone-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052842 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="openstack-network-exporter" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052853 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="73897ece-d31a-4b17-aae9-1b781bc5dc44" containerName="registry-server" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052862 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-central-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052874 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="abef8580-e6ef-4e8a-887d-bdf45e9a5bbe" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052883 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="463a9578-5093-4812-a27c-c4d849a5ec67" 
containerName="nova-metadata-metadata" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052918 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecc84ff-bf6e-49da-b3e7-24f595f52cd9" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052929 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f62813-7b25-45b6-9942-a5e54b40a235" containerName="neutron-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052941 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="64def3d7-6342-432a-a0c2-b562b7514bca" containerName="kube-state-metrics" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052951 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2735e6-11ae-45cc-94fe-9628468a810a" containerName="ovn-northd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052961 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052973 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="4474daf6-43cd-4602-895c-c4ca53b7b35d" containerName="heat-engine" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.052996 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-notifier" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053018 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a54421e-d5ab-4e38-a6da-d90860a2b3b3" containerName="aodh-listener" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053049 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="699b1f34-ccf2-4905-886d-10afe56a758f" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053057 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053069 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d1d96d1-04f9-4dd6-aabc-5a55b291d47b" containerName="glance-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053078 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="ceilometer-notification-agent" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053086 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="990ef3ca-87e1-4b2b-8714-86263523425b" containerName="cinder-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053096 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb12a3c2-3c6e-437a-ac06-021b2c18c4ef" containerName="nova-cell1-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053104 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f95e65-4a0d-4f33-b977-abca4f20aeef" containerName="nova-cell0-conductor-conductor" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053115 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9436321-8243-4a4a-b07a-b14668a07f1f" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053123 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="14265138-5fbe-4187-928c-fbe84d832080" containerName="placement-log" Oct 03 09:14:10 crc kubenswrapper[4810]: 
I1003 09:14:10.053134 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="36616a3d-b630-49fe-80a4-5e024f5f575f" containerName="barbican-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053145 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ff0594-a3c0-4939-8113-d4611d0a53ce" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053155 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b8a0d09-74be-4ada-a91c-90a30042cc32" containerName="proxy-httpd" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053163 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="e600d090-3ebc-4552-a5d6-0c3de54c5de3" containerName="mariadb-account-delete" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053178 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="6545022e-2717-4dda-801e-d88c8b037558" containerName="probe" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053187 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56de889-4796-4d3c-87b0-faff35387c26" containerName="memcached" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053196 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1164f6f-490c-4ade-9824-9157b3b46be2" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053207 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ad2114-1f17-4dee-ba01-428f6f8a33d8" containerName="adoption" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053217 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d48e153-4470-4082-8f04-319557247b18" containerName="nova-api-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053227 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="849f27b3-b111-4722-9e85-528f2fbed78d" containerName="rabbitmq" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053235 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="51261961-7cea-4ee8-8009-d0669f796caa" containerName="galera" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053250 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2aaa8b9-8b22-47de-9476-06746378f92f" containerName="horizon-log" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053262 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03f8837-7028-4fa8-b8e2-9e12a35037c5" containerName="heat-cfnapi" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.053274 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbd704a-5b8a-47e7-a5f9-7c6a20e01293" containerName="heat-api" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.054245 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.056856 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hfgcf"/"kube-root-ca.crt" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.057582 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hfgcf"/"openshift-service-ca.crt" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.058092 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hfgcf"/"default-dockercfg-wwr8q" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.073155 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hfgcf/must-gather-6kcxw"] Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.136018 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcg8\" (UniqueName: \"kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.136106 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.237169 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpcg8\" (UniqueName: \"kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.237297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.237755 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.263248 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcg8\" (UniqueName: \"kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8\") pod \"must-gather-6kcxw\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.376224 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:14:10 crc kubenswrapper[4810]: I1003 09:14:10.824085 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hfgcf/must-gather-6kcxw"] Oct 03 09:14:11 crc kubenswrapper[4810]: I1003 09:14:11.312000 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" event={"ID":"1515b15c-c878-43e5-8048-aa5c72ac62a6","Type":"ContainerStarted","Data":"86b10a9a6abc31b7241f9b5c3892fcedd12a120cb75c263b24df9830fdbbe2d4"} Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.351695 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" event={"ID":"1515b15c-c878-43e5-8048-aa5c72ac62a6","Type":"ContainerStarted","Data":"2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0"} Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.352721 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" event={"ID":"1515b15c-c878-43e5-8048-aa5c72ac62a6","Type":"ContainerStarted","Data":"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6"} Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.375270 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" podStartSLOduration=1.399684921 podStartE2EDuration="5.375245646s" podCreationTimestamp="2025-10-03 09:14:10 +0000 UTC" firstStartedPulling="2025-10-03 09:14:10.827398252 +0000 UTC m=+8284.254648997" lastFinishedPulling="2025-10-03 09:14:14.802958987 +0000 UTC m=+8288.230209722" observedRunningTime="2025-10-03 09:14:15.366823841 +0000 UTC m=+8288.794074576" watchObservedRunningTime="2025-10-03 09:14:15.375245646 +0000 UTC m=+8288.802496371" Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.880191 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-pbp8n"] Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.881701 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.943690 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt7m\" (UniqueName: \"kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:15 crc kubenswrapper[4810]: I1003 09:14:15.943790 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.045291 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt7m\" (UniqueName: \"kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.045386 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.045492 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.065853 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt7m\" (UniqueName: \"kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m\") pod \"crc-debug-pbp8n\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.200681 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:14:16 crc kubenswrapper[4810]: I1003 09:14:16.360104 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" event={"ID":"ed61e53d-ad4d-47ec-ac35-88da1003d469","Type":"ContainerStarted","Data":"a11b8e4709f1c94170ef47663ba2d077ce5fe027fbe6b231764371157409d609"} Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.479116 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" event={"ID":"ed61e53d-ad4d-47ec-ac35-88da1003d469","Type":"ContainerStarted","Data":"296ed0750056f73d3fc06e4f5a26871e9848e7739cf2c173227fdf91ece092be"} Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.496219 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" podStartSLOduration=1.906907442 podStartE2EDuration="14.496203736s" podCreationTimestamp="2025-10-03 09:14:15 +0000 UTC" firstStartedPulling="2025-10-03 09:14:16.228995301 +0000 UTC m=+8289.656246036" lastFinishedPulling="2025-10-03 09:14:28.818291595 +0000 UTC m=+8302.245542330" observedRunningTime="2025-10-03 09:14:29.491421398 +0000 UTC m=+8302.918672133" watchObservedRunningTime="2025-10-03 09:14:29.496203736 +0000 UTC m=+8302.923454471" Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.722478 4810 scope.go:117] "RemoveContainer" containerID="43e7dc7eb5e00cbb5009d4a66ffb6aa6e9fffc55143b955e5c66ac08125d7298" Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.743182 4810 scope.go:117] "RemoveContainer" containerID="adb7a5c808b93d6656ba7be4b3adb1cef9ad5a4f1a8d95fd455fb9be3ec69058" Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.760486 4810 scope.go:117] "RemoveContainer" containerID="66f1ba2e75970a1aba16f1afb2a73d7784ddebbe5e4717cbfe405fc0b9fc68dd" Oct 03 09:14:29 crc kubenswrapper[4810]: I1003 09:14:29.794549 4810 scope.go:117] "RemoveContainer" containerID="35ebea1b4f71fc0c6c9c8ae4c6ed23990e2f042b27e1024559087da107695952" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.143207 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7"] Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.145085 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.147428 4810 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.147559 4810 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.156504 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7"] Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.227143 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.227256 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.227287 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvccs\" (UniqueName: \"kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.329158 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.329264 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.329297 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvccs\" (UniqueName: \"kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.330765 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume\") pod 
\"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.348046 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvccs\" (UniqueName: \"kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.349091 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume\") pod \"collect-profiles-29324715-pjdx7\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.481321 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:00 crc kubenswrapper[4810]: I1003 09:15:00.932296 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7"] Oct 03 09:15:01 crc kubenswrapper[4810]: I1003 09:15:01.737473 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" event={"ID":"1eeb3e87-3d26-4988-b456-f861d9cc85f8","Type":"ContainerStarted","Data":"7c0ad716444cf3340811e35fa2c4cce1ecb56f83d43011c7158ed11c4da07bd1"} Oct 03 09:15:01 crc kubenswrapper[4810]: I1003 09:15:01.737937 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" event={"ID":"1eeb3e87-3d26-4988-b456-f861d9cc85f8","Type":"ContainerStarted","Data":"4ef9d02e106a609b2c7bb6a46fa022d0aa7c821c3c80f5586e70aa9d5da2f631"} Oct 03 09:15:01 crc kubenswrapper[4810]: I1003 09:15:01.768256 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" podStartSLOduration=1.7682318160000001 podStartE2EDuration="1.768231816s" podCreationTimestamp="2025-10-03 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:15:01.758408714 +0000 UTC m=+8335.185659449" watchObservedRunningTime="2025-10-03 09:15:01.768231816 +0000 UTC m=+8335.195482551" Oct 03 09:15:02 crc kubenswrapper[4810]: I1003 09:15:02.748223 4810 generic.go:334] "Generic (PLEG): container finished" podID="1eeb3e87-3d26-4988-b456-f861d9cc85f8" containerID="7c0ad716444cf3340811e35fa2c4cce1ecb56f83d43011c7158ed11c4da07bd1" exitCode=0 Oct 03 09:15:02 crc kubenswrapper[4810]: I1003 09:15:02.748449 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" event={"ID":"1eeb3e87-3d26-4988-b456-f861d9cc85f8","Type":"ContainerDied","Data":"7c0ad716444cf3340811e35fa2c4cce1ecb56f83d43011c7158ed11c4da07bd1"} Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.055329 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.196817 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume\") pod \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.197058 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume\") pod \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.197139 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvccs\" (UniqueName: \"kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs\") pod \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\" (UID: \"1eeb3e87-3d26-4988-b456-f861d9cc85f8\") " Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.199023 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume" (OuterVolumeSpecName: "config-volume") pod "1eeb3e87-3d26-4988-b456-f861d9cc85f8" (UID: "1eeb3e87-3d26-4988-b456-f861d9cc85f8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.204141 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1eeb3e87-3d26-4988-b456-f861d9cc85f8" (UID: "1eeb3e87-3d26-4988-b456-f861d9cc85f8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.215749 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs" (OuterVolumeSpecName: "kube-api-access-vvccs") pod "1eeb3e87-3d26-4988-b456-f861d9cc85f8" (UID: "1eeb3e87-3d26-4988-b456-f861d9cc85f8"). InnerVolumeSpecName "kube-api-access-vvccs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.299380 4810 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1eeb3e87-3d26-4988-b456-f861d9cc85f8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.299420 4810 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1eeb3e87-3d26-4988-b456-f861d9cc85f8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.299431 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvccs\" (UniqueName: \"kubernetes.io/projected/1eeb3e87-3d26-4988-b456-f861d9cc85f8-kube-api-access-vvccs\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.767396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" event={"ID":"1eeb3e87-3d26-4988-b456-f861d9cc85f8","Type":"ContainerDied","Data":"4ef9d02e106a609b2c7bb6a46fa022d0aa7c821c3c80f5586e70aa9d5da2f631"} Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.767831 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ef9d02e106a609b2c7bb6a46fa022d0aa7c821c3c80f5586e70aa9d5da2f631" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.767426 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-pjdx7" Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.821011 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4"] Oct 03 09:15:04 crc kubenswrapper[4810]: I1003 09:15:04.826496 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-jfdf4"] Oct 03 09:15:05 crc kubenswrapper[4810]: I1003 09:15:05.311279 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e716951-1a48-42f1-b790-e047fba7b477" path="/var/lib/kubelet/pods/9e716951-1a48-42f1-b790-e047fba7b477/volumes" Oct 03 09:15:29 crc kubenswrapper[4810]: I1003 09:15:29.859615 4810 scope.go:117] "RemoveContainer" containerID="dbab8a7e512236638d4064d4a8006a3f143bbc0cfea49417d843d0140eb8cafb" Oct 03 09:15:29 crc kubenswrapper[4810]: I1003 09:15:29.884408 4810 scope.go:117] "RemoveContainer" containerID="a72f3af1a00931516d7f54475ffe568d59818e118dc990168c338c158fa9a828" Oct 03 09:15:29 crc kubenswrapper[4810]: I1003 09:15:29.915318 4810 scope.go:117] "RemoveContainer" containerID="d9f9bc46f0e813c021c190f7eba0554587ee3170bfe9298bcf82420f3c2a5429" Oct 03 09:15:29 crc kubenswrapper[4810]: I1003 09:15:29.987190 4810 scope.go:117] "RemoveContainer" containerID="6f8d1a8773316235134643735f64759cbb71cf6070e1e1be8f28f85cc3728356" Oct 03 09:16:02 crc kubenswrapper[4810]: I1003 09:16:02.088785 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:02 crc kubenswrapper[4810]: I1003 09:16:02.090328 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" 
podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.327392 4810 generic.go:334] "Generic (PLEG): container finished" podID="ed61e53d-ad4d-47ec-ac35-88da1003d469" containerID="296ed0750056f73d3fc06e4f5a26871e9848e7739cf2c173227fdf91ece092be" exitCode=0 Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.327520 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" event={"ID":"ed61e53d-ad4d-47ec-ac35-88da1003d469","Type":"ContainerDied","Data":"296ed0750056f73d3fc06e4f5a26871e9848e7739cf2c173227fdf91ece092be"} Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.986624 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6gdwv"] Oct 03 09:16:16 crc kubenswrapper[4810]: E1003 09:16:16.987036 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eeb3e87-3d26-4988-b456-f861d9cc85f8" containerName="collect-profiles" Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.987060 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eeb3e87-3d26-4988-b456-f861d9cc85f8" containerName="collect-profiles" Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.987240 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eeb3e87-3d26-4988-b456-f861d9cc85f8" containerName="collect-profiles" Oct 03 09:16:16 crc kubenswrapper[4810]: I1003 09:16:16.988610 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.007025 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6gdwv"] Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.066291 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6fkj\" (UniqueName: \"kubernetes.io/projected/437239f6-563b-41e0-8895-3d1804c033f8-kube-api-access-t6fkj\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.066348 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-utilities\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.066365 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-catalog-content\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.167752 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6fkj\" (UniqueName: \"kubernetes.io/projected/437239f6-563b-41e0-8895-3d1804c033f8-kube-api-access-t6fkj\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " 
pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.167814 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-utilities\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.167833 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-catalog-content\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.168385 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-utilities\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.168432 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/437239f6-563b-41e0-8895-3d1804c033f8-catalog-content\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.185430 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6fkj\" (UniqueName: \"kubernetes.io/projected/437239f6-563b-41e0-8895-3d1804c033f8-kube-api-access-t6fkj\") pod \"community-operators-6gdwv\" (UID: \"437239f6-563b-41e0-8895-3d1804c033f8\") " pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.310623 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.426775 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.456816 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-pbp8n"] Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.465519 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-pbp8n"] Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.573598 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host\") pod \"ed61e53d-ad4d-47ec-ac35-88da1003d469\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.574080 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppt7m\" (UniqueName: \"kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m\") pod \"ed61e53d-ad4d-47ec-ac35-88da1003d469\" (UID: \"ed61e53d-ad4d-47ec-ac35-88da1003d469\") " Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.573761 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host" (OuterVolumeSpecName: "host") pod "ed61e53d-ad4d-47ec-ac35-88da1003d469" (UID: "ed61e53d-ad4d-47ec-ac35-88da1003d469"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.574495 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed61e53d-ad4d-47ec-ac35-88da1003d469-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.596463 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m" (OuterVolumeSpecName: "kube-api-access-ppt7m") pod "ed61e53d-ad4d-47ec-ac35-88da1003d469" (UID: "ed61e53d-ad4d-47ec-ac35-88da1003d469"). InnerVolumeSpecName "kube-api-access-ppt7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.654779 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6gdwv"] Oct 03 09:16:17 crc kubenswrapper[4810]: W1003 09:16:17.661818 4810 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437239f6_563b_41e0_8895_3d1804c033f8.slice/crio-6027e9d7ef6ba19ba39b58589242424af8ac93ceeae893a3f274067b9eca6c3b WatchSource:0}: Error finding container 6027e9d7ef6ba19ba39b58589242424af8ac93ceeae893a3f274067b9eca6c3b: Status 404 returned error can't find the container with id 6027e9d7ef6ba19ba39b58589242424af8ac93ceeae893a3f274067b9eca6c3b Oct 03 09:16:17 crc kubenswrapper[4810]: I1003 09:16:17.675366 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppt7m\" (UniqueName: \"kubernetes.io/projected/ed61e53d-ad4d-47ec-ac35-88da1003d469-kube-api-access-ppt7m\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.362302 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11b8e4709f1c94170ef47663ba2d077ce5fe027fbe6b231764371157409d609" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.362396 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-pbp8n" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.365247 4810 generic.go:334] "Generic (PLEG): container finished" podID="437239f6-563b-41e0-8895-3d1804c033f8" containerID="680fcb930367ad951ed1d93a754d53c79a3e421379e0b49943dfd60d23cf5721" exitCode=0 Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.365314 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gdwv" event={"ID":"437239f6-563b-41e0-8895-3d1804c033f8","Type":"ContainerDied","Data":"680fcb930367ad951ed1d93a754d53c79a3e421379e0b49943dfd60d23cf5721"} Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.365359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gdwv" event={"ID":"437239f6-563b-41e0-8895-3d1804c033f8","Type":"ContainerStarted","Data":"6027e9d7ef6ba19ba39b58589242424af8ac93ceeae893a3f274067b9eca6c3b"} Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.636635 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-fwkmz"] Oct 03 09:16:18 crc kubenswrapper[4810]: E1003 09:16:18.637231 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed61e53d-ad4d-47ec-ac35-88da1003d469" containerName="container-00" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.637244 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed61e53d-ad4d-47ec-ac35-88da1003d469" containerName="container-00" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.637403 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed61e53d-ad4d-47ec-ac35-88da1003d469" containerName="container-00" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.637941 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.792691 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.792868 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtks2\" (UniqueName: \"kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.894915 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.894986 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.895030 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtks2\" (UniqueName: \"kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.914392 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtks2\" (UniqueName: \"kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2\") pod \"crc-debug-fwkmz\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:18 crc kubenswrapper[4810]: I1003 09:16:18.953956 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:19 crc kubenswrapper[4810]: I1003 09:16:19.311679 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed61e53d-ad4d-47ec-ac35-88da1003d469" path="/var/lib/kubelet/pods/ed61e53d-ad4d-47ec-ac35-88da1003d469/volumes" Oct 03 09:16:19 crc kubenswrapper[4810]: I1003 09:16:19.374359 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" event={"ID":"0d80e09e-1cc9-41ae-9455-34227a8a0bfc","Type":"ContainerStarted","Data":"8c6bc785c0b6ac6b1ce2ed16dd473625fabfe7fec5de3fe08daa91ec7bd073bb"} Oct 03 09:16:19 crc kubenswrapper[4810]: I1003 09:16:19.374410 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" event={"ID":"0d80e09e-1cc9-41ae-9455-34227a8a0bfc","Type":"ContainerStarted","Data":"fdaa53e8762dfa1611fe8734eca2d82a1aef87d6faafbf85bee902d20a8142a0"} Oct 03 09:16:19 crc kubenswrapper[4810]: I1003 09:16:19.390496 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" podStartSLOduration=1.390475693 podStartE2EDuration="1.390475693s" podCreationTimestamp="2025-10-03 09:16:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:16:19.388389908 +0000 UTC m=+8412.815640643" watchObservedRunningTime="2025-10-03 09:16:19.390475693 +0000 UTC m=+8412.817726428" Oct 03 09:16:20 crc kubenswrapper[4810]: I1003 09:16:20.387077 4810 generic.go:334] "Generic (PLEG): container finished" podID="0d80e09e-1cc9-41ae-9455-34227a8a0bfc" containerID="8c6bc785c0b6ac6b1ce2ed16dd473625fabfe7fec5de3fe08daa91ec7bd073bb" exitCode=0 Oct 03 09:16:20 crc kubenswrapper[4810]: I1003 09:16:20.387201 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" event={"ID":"0d80e09e-1cc9-41ae-9455-34227a8a0bfc","Type":"ContainerDied","Data":"8c6bc785c0b6ac6b1ce2ed16dd473625fabfe7fec5de3fe08daa91ec7bd073bb"} Oct 03 09:16:22 crc kubenswrapper[4810]: I1003 09:16:22.799073 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:22 crc kubenswrapper[4810]: I1003 09:16:22.957408 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host\") pod \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " Oct 03 09:16:22 crc kubenswrapper[4810]: I1003 09:16:22.957636 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtks2\" (UniqueName: \"kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2\") pod \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\" (UID: \"0d80e09e-1cc9-41ae-9455-34227a8a0bfc\") " Oct 03 09:16:22 crc kubenswrapper[4810]: I1003 09:16:22.958616 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host" (OuterVolumeSpecName: "host") pod "0d80e09e-1cc9-41ae-9455-34227a8a0bfc" (UID: "0d80e09e-1cc9-41ae-9455-34227a8a0bfc"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:16:22 crc kubenswrapper[4810]: I1003 09:16:22.963664 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2" (OuterVolumeSpecName: "kube-api-access-gtks2") pod "0d80e09e-1cc9-41ae-9455-34227a8a0bfc" (UID: "0d80e09e-1cc9-41ae-9455-34227a8a0bfc"). InnerVolumeSpecName "kube-api-access-gtks2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.059862 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtks2\" (UniqueName: \"kubernetes.io/projected/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-kube-api-access-gtks2\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.060124 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d80e09e-1cc9-41ae-9455-34227a8a0bfc-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.411958 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" event={"ID":"0d80e09e-1cc9-41ae-9455-34227a8a0bfc","Type":"ContainerDied","Data":"fdaa53e8762dfa1611fe8734eca2d82a1aef87d6faafbf85bee902d20a8142a0"} Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.411999 4810 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdaa53e8762dfa1611fe8734eca2d82a1aef87d6faafbf85bee902d20a8142a0" Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.412973 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-fwkmz" Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.413995 4810 generic.go:334] "Generic (PLEG): container finished" podID="437239f6-563b-41e0-8895-3d1804c033f8" containerID="e3421e6a4c8847a9a6594a0e40b4617217260a38f5ff00fb6eca9bba1bd1332d" exitCode=0 Oct 03 09:16:23 crc kubenswrapper[4810]: I1003 09:16:23.414037 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gdwv" event={"ID":"437239f6-563b-41e0-8895-3d1804c033f8","Type":"ContainerDied","Data":"e3421e6a4c8847a9a6594a0e40b4617217260a38f5ff00fb6eca9bba1bd1332d"} Oct 03 09:16:25 crc kubenswrapper[4810]: I1003 09:16:25.428852 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6gdwv" event={"ID":"437239f6-563b-41e0-8895-3d1804c033f8","Type":"ContainerStarted","Data":"c92fbc45a13247cb886587f34716bbf858b508c7d65690b146470b801dda3339"} Oct 03 09:16:25 crc kubenswrapper[4810]: I1003 09:16:25.452673 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6gdwv" podStartSLOduration=3.557167073 podStartE2EDuration="9.452651009s" podCreationTimestamp="2025-10-03 09:16:16 +0000 UTC" firstStartedPulling="2025-10-03 09:16:18.368421285 +0000 UTC m=+8411.795672020" lastFinishedPulling="2025-10-03 09:16:24.263905221 +0000 UTC m=+8417.691155956" observedRunningTime="2025-10-03 09:16:25.446924547 +0000 UTC m=+8418.874175282" watchObservedRunningTime="2025-10-03 09:16:25.452651009 +0000 UTC m=+8418.879901744" Oct 03 09:16:27 crc kubenswrapper[4810]: I1003 09:16:27.312405 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:27 crc kubenswrapper[4810]: I1003 
09:16:27.312736 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:27 crc kubenswrapper[4810]: I1003 09:16:27.364713 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:28 crc kubenswrapper[4810]: I1003 09:16:28.705451 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-fwkmz"] Oct 03 09:16:28 crc kubenswrapper[4810]: I1003 09:16:28.710293 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-fwkmz"] Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.324941 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d80e09e-1cc9-41ae-9455-34227a8a0bfc" path="/var/lib/kubelet/pods/0d80e09e-1cc9-41ae-9455-34227a8a0bfc/volumes" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.891038 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-jldgk"] Oct 03 09:16:29 crc kubenswrapper[4810]: E1003 09:16:29.891726 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d80e09e-1cc9-41ae-9455-34227a8a0bfc" containerName="container-00" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.891753 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d80e09e-1cc9-41ae-9455-34227a8a0bfc" containerName="container-00" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.891964 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d80e09e-1cc9-41ae-9455-34227a8a0bfc" containerName="container-00" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.892568 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.944503 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfmq\" (UniqueName: \"kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:29 crc kubenswrapper[4810]: I1003 09:16:29.944787 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.046316 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.046492 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.046519 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfmq\" (UniqueName: \"kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.065887 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfmq\" (UniqueName: \"kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq\") pod \"crc-debug-jldgk\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.210986 4810 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.471422 4810 generic.go:334] "Generic (PLEG): container finished" podID="353c554b-572c-4c49-adb1-73cbe8c5bd10" containerID="ffc6663df273ca3fdf5c6eb7a4825893bbd12a83709e471df3705746c2ff3fc1" exitCode=0 Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.471499 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" event={"ID":"353c554b-572c-4c49-adb1-73cbe8c5bd10","Type":"ContainerDied","Data":"ffc6663df273ca3fdf5c6eb7a4825893bbd12a83709e471df3705746c2ff3fc1"} Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.471700 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" event={"ID":"353c554b-572c-4c49-adb1-73cbe8c5bd10","Type":"ContainerStarted","Data":"0c635648948efce6cc0171ef3bb19f28efee8aa4c6856e8520857f1afd6ebed5"} Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.502157 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-jldgk"] Oct 03 09:16:30 crc kubenswrapper[4810]: I1003 09:16:30.506475 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hfgcf/crc-debug-jldgk"] Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.556828 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.578467 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host\") pod \"353c554b-572c-4c49-adb1-73cbe8c5bd10\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.578600 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host" (OuterVolumeSpecName: "host") pod "353c554b-572c-4c49-adb1-73cbe8c5bd10" (UID: "353c554b-572c-4c49-adb1-73cbe8c5bd10"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.578638 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnfmq\" (UniqueName: \"kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq\") pod \"353c554b-572c-4c49-adb1-73cbe8c5bd10\" (UID: \"353c554b-572c-4c49-adb1-73cbe8c5bd10\") " Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.579013 4810 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/353c554b-572c-4c49-adb1-73cbe8c5bd10-host\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.584166 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq" (OuterVolumeSpecName: "kube-api-access-rnfmq") pod "353c554b-572c-4c49-adb1-73cbe8c5bd10" (UID: "353c554b-572c-4c49-adb1-73cbe8c5bd10"). InnerVolumeSpecName "kube-api-access-rnfmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:16:31 crc kubenswrapper[4810]: I1003 09:16:31.680816 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnfmq\" (UniqueName: \"kubernetes.io/projected/353c554b-572c-4c49-adb1-73cbe8c5bd10-kube-api-access-rnfmq\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.088530 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.088589 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.489545 4810 scope.go:117] "RemoveContainer" containerID="ffc6663df273ca3fdf5c6eb7a4825893bbd12a83709e471df3705746c2ff3fc1" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.489556 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/crc-debug-jldgk" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.536456 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_deb66def-1afa-46cc-bacf-2f6d3f8ed245/openstack-network-exporter/0.log" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.717635 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_deb66def-1afa-46cc-bacf-2f6d3f8ed245/ovsdbserver-nb/0.log" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.893499 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8d17221-136d-42b3-9e68-f5eb5b026ae2/openstack-network-exporter/0.log" Oct 03 09:16:32 crc kubenswrapper[4810]: I1003 09:16:32.972933 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8d17221-136d-42b3-9e68-f5eb5b026ae2/ovsdbserver-nb/0.log" Oct 03 09:16:33 crc kubenswrapper[4810]: I1003 09:16:33.101483 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d41676c8-318c-4033-a53e-c383f4f18fd7/openstack-network-exporter/0.log" Oct 03 09:16:33 crc kubenswrapper[4810]: I1003 09:16:33.175775 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d41676c8-318c-4033-a53e-c383f4f18fd7/ovsdbserver-sb/0.log" Oct 03 09:16:33 crc kubenswrapper[4810]: I1003 09:16:33.311107 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353c554b-572c-4c49-adb1-73cbe8c5bd10" path="/var/lib/kubelet/pods/353c554b-572c-4c49-adb1-73cbe8c5bd10/volumes" Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.360935 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6gdwv" Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.422500 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6gdwv"] Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.466236 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.466728 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k67nt" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="registry-server" containerID="cri-o://ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3" gracePeriod=2 Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.937608 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67nt" Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.973184 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7b2p\" (UniqueName: \"kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p\") pod \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.973258 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities\") pod \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.973490 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content\") pod \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\" (UID: \"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68\") " Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.974109 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities" (OuterVolumeSpecName: "utilities") pod "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" (UID: "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:16:37 crc kubenswrapper[4810]: I1003 09:16:37.979038 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p" (OuterVolumeSpecName: "kube-api-access-m7b2p") pod "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" (UID: "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68"). InnerVolumeSpecName "kube-api-access-m7b2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.075077 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7b2p\" (UniqueName: \"kubernetes.io/projected/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-kube-api-access-m7b2p\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.075112 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.342415 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" (UID: "5e776c8b-d2f2-41b6-a795-dac7bf2c4a68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.378536 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.539755 4810 generic.go:334] "Generic (PLEG): container finished" podID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerID="ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3" exitCode=0 Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.539802 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerDied","Data":"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3"} Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.539840 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k67nt" event={"ID":"5e776c8b-d2f2-41b6-a795-dac7bf2c4a68","Type":"ContainerDied","Data":"8c6971ea82f27a55122ea62062765d7d56f305bf07dd87666694e163491276ad"} Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.539850 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k67nt" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.539857 4810 scope.go:117] "RemoveContainer" containerID="ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.576800 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.576982 4810 scope.go:117] "RemoveContainer" containerID="d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.587130 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k67nt"] Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.602503 4810 scope.go:117] "RemoveContainer" containerID="e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.631300 4810 scope.go:117] "RemoveContainer" containerID="ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3" Oct 03 09:16:38 crc kubenswrapper[4810]: E1003 09:16:38.631695 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3\": container with ID starting with ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3 not found: ID does not exist" containerID="ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.631726 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3"} err="failed to get container status \"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3\": rpc error: code = NotFound desc = could not find container \"ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3\": container with ID starting with ee753f5b97138928ee8844a3bccd516454424ec5be6ac2dd907b904dc68f02e3 not found: ID does not exist" Oct 03 
09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.631748 4810 scope.go:117] "RemoveContainer" containerID="d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650" Oct 03 09:16:38 crc kubenswrapper[4810]: E1003 09:16:38.632039 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650\": container with ID starting with d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650 not found: ID does not exist" containerID="d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.632072 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650"} err="failed to get container status \"d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650\": rpc error: code = NotFound desc = could not find container \"d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650\": container with ID starting with d0877d472e8d3441801b6d860b7fbf71f2e880a753e3444b35ddff9a27d7c650 not found: ID does not exist" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.632091 4810 scope.go:117] "RemoveContainer" containerID="e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6" Oct 03 09:16:38 crc kubenswrapper[4810]: E1003 09:16:38.632339 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6\": container with ID starting with e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6 not found: ID does not exist" containerID="e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6" Oct 03 09:16:38 crc kubenswrapper[4810]: I1003 09:16:38.632369 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6"} err="failed to get container status \"e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6\": rpc error: code = NotFound desc = could not find container \"e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6\": container with ID starting with e07d35cdaf6f38d6e92cb72703d39879fb3489b04c9ad1b15ebf887c6a7d4ef6 not found: ID does not exist" Oct 03 09:16:39 crc kubenswrapper[4810]: I1003 09:16:39.311512 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" path="/var/lib/kubelet/pods/5e776c8b-d2f2-41b6-a795-dac7bf2c4a68/volumes" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.453289 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/util/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.654680 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/pull/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.661524 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/pull/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 
09:16:44.662053 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/util/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.823655 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/pull/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.824973 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/util/0.log" Oct 03 09:16:44 crc kubenswrapper[4810]: I1003 09:16:44.845952 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_00fdfcf229b8ff1fdb3fe92d2c04f3dec332acb95c91f2ef3a9c95bafemnbtt_b9f7779b-bb8c-4383-a9fb-3bfe5c38784c/extract/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.030252 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-5d7tm_4686e0fc-c208-4a38-a7ae-33adc6123d0d/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.062580 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-jlqcq_701c6350-6581-453d-abc4-728bff24f1a5/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.136467 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6d6d64fdcf-5d7tm_4686e0fc-c208-4a38-a7ae-33adc6123d0d/manager/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.281154 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8686fd99f7-jlqcq_701c6350-6581-453d-abc4-728bff24f1a5/manager/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.287176 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-nthwx_911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.340359 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-58d86cd59d-nthwx_911421e2-0d0c-4e76-b9a7-4cc5cc1ef41d/manager/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.476729 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-kln57_e910dcfe-cedc-4a55-ad55-39adb2422c48/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.636467 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-d785ddfd5-kln57_e910dcfe-cedc-4a55-ad55-39adb2422c48/manager/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.671636 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-nf2kl_d812ca0a-067a-4c91-a460-d340ef72d051/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.748130 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5ffbdb7ddf-nf2kl_d812ca0a-067a-4c91-a460-d340ef72d051/manager/0.log" Oct 
03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.869471 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-qxv5t_84c853db-41b6-4013-b6a6-b9f6c3fa74e3/kube-rbac-proxy/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.883791 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-586b66cf4f-qxv5t_84c853db-41b6-4013-b6a6-b9f6c3fa74e3/manager/0.log" Oct 03 09:16:45 crc kubenswrapper[4810]: I1003 09:16:45.997396 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-wqn2b_42afdadf-08ba-4196-a815-a4fc0acf2181/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.219144 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-hv9vj_7ed3bd35-9f00-40d3-9ed0-a111c4131b11/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.252061 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-59b5fc9845-hv9vj_7ed3bd35-9f00-40d3-9ed0-a111c4131b11/manager/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.353745 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7c9978f67-wqn2b_42afdadf-08ba-4196-a815-a4fc0acf2181/manager/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.442480 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-zvpws_21a68e29-471e-428d-9491-ae4b33a01e8a/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.550564 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c9969c6c6-zvpws_21a68e29-471e-428d-9491-ae4b33a01e8a/manager/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.587429 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-47p2c_6c644f74-4957-40ad-8286-b172b802a323/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.648106 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-66fdd975d9-47p2c_6c644f74-4957-40ad-8286-b172b802a323/manager/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.755627 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-msm2p_9aeddbe0-0fe7-451c-afaf-bd0074505142/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.809098 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-696ff4bcdd-msm2p_9aeddbe0-0fe7-451c-afaf-bd0074505142/manager/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.942655 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-6qbx2_6ef1249c-27f8-4cc0-8134-690b6b8773d1/kube-rbac-proxy/0.log" Oct 03 09:16:46 crc kubenswrapper[4810]: I1003 09:16:46.995313 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-549fb68678-6qbx2_6ef1249c-27f8-4cc0-8134-690b6b8773d1/manager/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.051375 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-cj2zz_5fa60d72-b0e4-4a27-b2af-d228cd9db5da/kube-rbac-proxy/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.231475 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-9qkqx_b1578049-8763-4c26-b149-8497d94da92e/kube-rbac-proxy/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.274379 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-b4444585c-9qkqx_b1578049-8763-4c26-b149-8497d94da92e/manager/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.323294 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5b45478b88-cj2zz_5fa60d72-b0e4-4a27-b2af-d228cd9db5da/manager/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.425292 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48_aaf10555-293e-4c2b-baea-ffcdd4eeb046/kube-rbac-proxy/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.468415 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7fb4f565cd5xb48_aaf10555-293e-4c2b-baea-ffcdd4eeb046/manager/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.616951 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54df7874c5-l442f_9ac05823-a077-4e2a-8bf8-b5e9a454bf03/kube-rbac-proxy/0.log" Oct 03 09:16:47 crc kubenswrapper[4810]: I1003 09:16:47.837028 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86f8d7b75f-wl9x5_6526f954-e830-42ea-b0a1-6121ff0e32c1/kube-rbac-proxy/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.014103 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-86f8d7b75f-wl9x5_6526f954-e830-42ea-b0a1-6121ff0e32c1/operator/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.190070 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kkbw6_bd51530d-735f-4124-bb3e-4c3967f64a31/registry-server/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.263510 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-gnvp7_2514440b-b998-4078-834e-642c3bcae80f/kube-rbac-proxy/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.391123 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-855d7949fc-gnvp7_2514440b-b998-4078-834e-642c3bcae80f/manager/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.393290 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-tnrxp_b842fd5e-6125-40cf-b729-947e54309c87/kube-rbac-proxy/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.517887 4810 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-ccbfcb8c-tnrxp_b842fd5e-6125-40cf-b729-947e54309c87/manager/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.615454 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-bcntr_52bc186b-754c-4fc1-982d-435c345718a7/operator/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.715728 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-7dv7x_1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63/kube-rbac-proxy/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.893073 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-vf8h7_368ec6b5-ba12-467e-ab4c-d46a83c31483/kube-rbac-proxy/0.log" Oct 03 09:16:48 crc kubenswrapper[4810]: I1003 09:16:48.893915 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-76d5577b-7dv7x_1a5d562f-6fe0-4bc9-a39a-7f7d8e9c1b63/manager/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.115555 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-hmtgk_d6c2b423-4398-416f-bb17-70a8eb814964/kube-rbac-proxy/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.131626 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6bb6dcddc-hmtgk_d6c2b423-4398-416f-bb17-70a8eb814964/manager/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.160459 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5ffb97cddf-vf8h7_368ec6b5-ba12-467e-ab4c-d46a83c31483/manager/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.364171 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-4hcbj_9ab8a530-fefb-477b-85f7-6f716f684292/kube-rbac-proxy/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.391842 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5595cf6c95-4hcbj_9ab8a530-fefb-477b-85f7-6f716f684292/manager/0.log" Oct 03 09:16:49 crc kubenswrapper[4810]: I1003 09:16:49.725492 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-54df7874c5-l442f_9ac05823-a077-4e2a-8bf8-b5e9a454bf03/manager/0.log" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.088723 4810 patch_prober.go:28] interesting pod/machine-config-daemon-z8f25 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.089203 4810 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.089240 4810 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.089784 4810 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6"} pod="openshift-machine-config-operator/machine-config-daemon-z8f25" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.089826 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerName="machine-config-daemon" containerID="cri-o://53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" gracePeriod=600 Oct 03 09:17:02 crc kubenswrapper[4810]: E1003 09:17:02.208561 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.710432 4810 generic.go:334] "Generic (PLEG): container finished" podID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" exitCode=0 Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.711053 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" event={"ID":"e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930","Type":"ContainerDied","Data":"53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6"} Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.711169 4810 scope.go:117] "RemoveContainer" containerID="77c9ab96bca0ae7cc385f2151aa6af6a5bcc6198cc2b591d92de881c34829df0" Oct 03 09:17:02 crc kubenswrapper[4810]: I1003 09:17:02.711927 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:17:02 crc kubenswrapper[4810]: E1003 09:17:02.712271 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.420288 4810 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-2" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" containerID="cri-o://48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64" gracePeriod=300 Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.421380 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-2" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" 
containerID="cri-o://48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64" gracePeriod=2 Oct 03 09:17:03 crc kubenswrapper[4810]: E1003 09:17:03.430130 4810 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 09:17:03 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]] Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 
09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
[the same OVN_Northbound cluster/status polling iteration, each time returning STATUS=leaving, repeats verbatim for the remaining 09:17:03 entries]
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-2" message=<
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
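
The entries above are the bash -x trace of /usr/local/bin/container-scripts/cleanup.sh, executed in the ovsdbserver-nb container of pod openstack/ovsdbserver-nb-2: the script sources its functions file, asks the local ovsdb-server to leave the OVN_Northbound Raft cluster, and then polls cluster/status once per second while the reported status is still "leaving". Below is a minimal sketch of that loop reconstructed from the trace; the overall control flow, the southbound branch, and anything not literally visible in the trace are assumptions, not the verbatim container script.

    #!/bin/bash
    # Sketch reconstructed from the -x trace in this log; not the verbatim cleanup.sh.
    set -x
    # The sourced functions file is what sets DB_TYPE=nb and DB_FILE=/etc/ovn/ovnnb_db.db in the trace.
    source "$(dirname /usr/local/bin/container-scripts/cleanup.sh)/functions"
    DB_NAME=OVN_Northbound
    if [[ ${DB_TYPE} == "sb" ]]; then        # assumed meaning of the '[[ nb == sb ]]' test above
        DB_NAME=OVN_Southbound
    fi
    # Only non-bootstrap replicas (hostname other than ovsdbserver-nb-0) leave the Raft cluster on shutdown.
    if [[ $(hostname) != "ovsdbserver-${DB_TYPE}-0" ]]; then
        ovs-appctl -t /tmp/ovn${DB_TYPE}_db.ctl cluster/leave ${DB_NAME} || true
        while true; do
            STATUS=$(ovs-appctl -t /tmp/ovn${DB_TYPE}_db.ctl cluster/status ${DB_NAME} | grep Status: | awk -e '{print $2}')
            # Stop once the server reports it has left the cluster or stops answering;
            # otherwise re-check in one second, which is the repeated "+ sleep 1" cycle seen in the log.
            if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
                break
            fi
            sleep 1
        done
    fi
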
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 [... the same status-poll iteration repeats ...] Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: > Oct 03 09:17:03 crc kubenswrapper[4810]: E1003 09:17:03.430377 4810 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 09:17:03 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]] Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true [... the same status-poll iteration repeats ...] Oct 03 09:17:03 crc kubenswrapper[4810]: ++
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep 
Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + 
STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: > pod="openstack/ovsdbserver-nb-2" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" containerID="cri-o://48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64" Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.469717 4810 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-nb-1" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" containerID="cri-o://448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40" gracePeriod=300 Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.469802 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-1" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" containerID="cri-o://448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40" gracePeriod=2 Oct 03 09:17:03 crc kubenswrapper[4810]: E1003 09:17:03.485138 4810 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 03 09:17:03 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]] Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z 
leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc
kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 
crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc 
kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
Oct 03 09:17:03 crc kubenswrapper[4810]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-nb" pod="openstack/ovsdbserver-nb-1" message=<
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
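The set -x trace above is the ovsdbserver-nb-1 cleanup hook requesting cluster/leave for OVN_Northbound and then polling the RAFT status once per second while it reports "leaving". A minimal sketch of the loop the trace implies follows; the actual /usr/local/bin/container-scripts/cleanup.sh and the sourced functions file are not included in this log, so the variable derivations and the exact form of the exit condition are reconstructed from the trace and should be read as assumptions, not the shipped script.

    #!/bin/bash
    # Sketch reconstructed from the set -x output above (not the actual cleanup.sh).
    DB_TYPE=nb                              # the trace also tests for "sb" (Southbound); that branch is not visible here
    DB_FILE=/etc/ovn/ovn${DB_TYPE}_db.db    # set in the trace, not used further in this sketch
    DB_NAME=OVN_Northbound

    # Only non-0 replicas leave the RAFT cluster; the trace compares
    # the hostname (ovsdbserver-nb-1) against ovsdbserver-nb-0.
    if [[ "$(hostname)" != "ovsdbserver-${DB_TYPE}-0" ]]; then
        ovs-appctl -t /tmp/ovn${DB_TYPE}_db.ctl cluster/leave "${DB_NAME}"
        # Poll until the server reports it has left, or the Status: line disappears.
        while true; do
            STATUS=$(ovs-appctl -t /tmp/ovn${DB_TYPE}_db.ctl cluster/status "${DB_NAME}" \
                     | grep Status: | awk -e '{print $2}')
            [ -z "$STATUS" -o "x$STATUS" = 'xleft cluster' ] && break
            sleep 1
        done
    fi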
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl 
cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving 
= 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: > Oct 03 09:17:03 crc kubenswrapper[4810]: E1003 09:17:03.485335 4810 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 03 09:17:03 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=nb Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnnb_db.db Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ nb == \s\b ]] Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-nb-1 != \o\v\s\d\b\s\e\r\v\e\r\-\n\b\-\0 ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnnb_db.ctl cluster/leave OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status 
OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnnb_db.ctl cluster/status OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: > pod="openstack/ovsdbserver-nb-1" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" containerID="cri-o://448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40"
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.630131 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jdh7s_d564c64d-4755-40aa-967d-8fb49704ef10/control-plane-machine-set-operator/0.log"
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.735606 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8d17221-136d-42b3-9e68-f5eb5b026ae2/ovsdbserver-nb/0.log"
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.735653 4810 generic.go:334] "Generic (PLEG): container finished" podID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerID="48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64" exitCode=143
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.735704 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerDied","Data":"48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64"}
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.744646 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_deb66def-1afa-46cc-bacf-2f6d3f8ed245/ovsdbserver-nb/0.log"
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.744686 4810 generic.go:334] "Generic (PLEG): container finished" podID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerID="448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40" exitCode=143
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.744709 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerDied","Data":"448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40"}
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.775968 4810 kuberuntime_container.go:700] "PreStop hook not completed in grace period" pod="openstack/ovsdbserver-sb-2" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" containerID="cri-o://6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" gracePeriod=300
Oct 03 09:17:03 crc kubenswrapper[4810]: I1003 09:17:03.776046 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-2" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" containerID="cri-o://6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" gracePeriod=2
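[Editor's note] The repeated trace above is the pod's preStop hook, /usr/local/bin/container-scripts/cleanup.sh, running with shell tracing enabled: it asks the local OVN raft member to leave the cluster and then polls cluster/status once per second until the member reports it has left. The hook never completed within the grace period, so the kubelet killed the containers (exit code 143 = 128+SIGTERM, 137 = 128+SIGKILL, both visible in the entries above and below). A minimal bash sketch of that logic, reconstructed only from this trace and the southbound trace that follows; the actual cleanup.sh and the sourced functions file are not part of this log, so the DB_TYPE variable's origin, the socket path pattern, and the exact hostname check are assumptions:

    #!/bin/bash
    # Hypothetical reconstruction of the traced preStop hook; not the shipped script.
    set -x
    # Trace shows '++ dirname .../cleanup.sh' followed by '+ source .../functions',
    # which appears to set DB_TYPE (nb or sb) and DB_FILE.
    source "$(dirname "$0")/functions"

    DB_NAME=OVN_Northbound
    [[ "${DB_TYPE}" == "sb" ]] && DB_NAME=OVN_Southbound   # trace: '+ [[ sb == \s\b ]]'

    # Only non-0 replicas leave the raft cluster (trace compares the hostname,
    # e.g. ovsdbserver-sb-2, against the -0 replica name).
    if [[ "$(hostname)" != "ovsdbserver-${DB_TYPE}-0" ]]; then
        ovs-appctl -t "/tmp/ovn${DB_TYPE}_db.ctl" cluster/leave "${DB_NAME}"
        # Poll until the server reports it has left the cluster, one check per second.
        while true; do
            STATUS=$(ovs-appctl -t "/tmp/ovn${DB_TYPE}_db.ctl" cluster/status "${DB_NAME}" \
                       | grep Status: | awk -e '{print $2}')
            [ -z "${STATUS}" -o "x${STATUS}" = "xleft cluster" ] && break
            sleep 1
        done
    fi

In this log the status stays at "leaving" for the entire grace period, so the loop above never terminates and the kubelet eventually force-kills the container, as the entries below for ovsdbserver-sb-2 show.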
09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=sb Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnsb_db.db Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ sb == \s\b ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]] Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl 
cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving 
= 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
[the seven-line poll above repeats, all timestamped 09:17:03, for every remaining iteration of this exec's wait loop; each poll reads back STATUS=leaving, and only the relative ordering of the ovs-appctl/grep/awk trace lines differs between iterations]
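The trace above is the tail of a status wait loop driven by the container's cleanup hook. A minimal sketch of the loop as implied by the xtrace output (the socket path, the STATUS variable and the test come straight from the trace; the while/break structure is an inference, so the real /usr/local/bin/container-scripts/cleanup.sh may differ):

    while true; do
        # Ask the local ovsdb-server for its Raft membership state.
        STATUS=$(ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound | grep Status: | awk -e '{print $2}')
        # Stop waiting once the server no longer answers or reports it has left the cluster.
        if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
            break
        fi
        sleep 1
    done

Every iteration captured here reads back "leaving", so the loop sleeps one second and polls again; the same cluster/status command can be run by hand against /tmp/ovnsb_db.ctl to watch the member's departure progress.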
Oct 03 09:17:03 crc kubenswrapper[4810]: > execCommand=["/usr/local/bin/container-scripts/cleanup.sh"] containerName="ovsdbserver-sb" pod="openstack/ovsdbserver-sb-2" message=<
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=sb
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ sb == \s\b ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
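Those lines are the preamble of a fresh cleanup.sh invocation for the ovsdbserver-sb container. A rough reconstruction of what that preamble does, based only on the trace (the sourced functions file, the literal socket path and the exact conditionals are assumptions inferred from the xtrace lines):

    # functions (sourced from the script's own directory) sets the DB flavour:
    #   DB_TYPE=sb  and  DB_FILE=/etc/ovn/ovnsb_db.db
    source "$(dirname "$0")/functions"   # traced as /usr/local/bin/container-scripts/functions

    # Pick the schema name for ovs-appctl from the DB type.
    DB_NAME=OVN_Northbound
    [[ "${DB_TYPE}" == "sb" ]] && DB_NAME=OVN_Southbound

    # Apparently only replicas other than the bootstrap member (ovsdbserver-sb-0)
    # ask to leave the Raft cluster before shutting down.
    if [[ "$(hostname)" != "ovsdbserver-sb-0" ]]; then
        ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave "${DB_NAME}"
    fi

On ovsdbserver-sb-2 the hostname check passes, the leave request is issued, and the script falls into the same wait loop sketched earlier, polling cluster/status once per second.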
[the poll repeats in the same way for the remainder of the 09:17:03 burst: cluster/status, grep Status:, awk '{print $2}', STATUS=leaving, the left-cluster test, then sleep 1, with only the interleaving of the pipeline's trace lines varying]
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
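The xtrace above is the PreStop hook of the ovsdbserver-sb-2 container polling the OVN_Southbound Raft status once per second and reading back "leaving" on every pass; the "PreStop hook failed ... exited with 137" entry that follows shows the kubelet eventually SIGKILLed the hook (137 = 128 + 9) when the pod's termination grace period ran out. The sketch below reconstructs that wait loop from this trace alone: the paths, commands, and values are copied from the log, while the overall script structure and the contents of the sourced functions file are assumptions, so the real /usr/local/bin/container-scripts/cleanup.sh may differ.

  #!/bin/bash
  # Reconstruction sketch only -- assembled from the xtrace in this journal,
  # not the cleanup.sh actually shipped in the container image.
  set -x

  # The sourced "functions" file appears to provide the DB settings.
  DB_TYPE=sb
  DB_FILE=/etc/ovn/ovnsb_db.db

  DB_NAME=OVN_Northbound
  [[ ${DB_TYPE} == sb ]] && DB_NAME=OVN_Southbound

  # Only non-bootstrap replicas leave the Raft cluster on shutdown
  # (the trace compares the hostname against ovsdbserver-sb-0).
  if [[ $(hostname) != ovsdbserver-sb-0 ]]; then
      ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave ${DB_NAME}
      # Poll until the server reports it has left the cluster. There is no
      # timeout, so if the status never moves past "leaving" the loop keeps
      # sleeping until the kubelet kills the hook at the end of the grace
      # period, which is what this journal records.
      while true; do
          STATUS=$(ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status ${DB_NAME} | grep Status: | awk -e '{print $2}')
          if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
              break
          fi
          sleep 1
      done
  fi

Note that, as written here (and as traced), awk keeps only the second field of the "Status:" line, so a "left cluster" status would come back as just "left"; the exit test in this sketch can therefore only succeed once cluster/status stops printing a Status line at all.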
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: >
Oct 03 09:17:03 crc kubenswrapper[4810]: E1003 09:17:03.792789 4810 kuberuntime_container.go:691] "PreStop hook failed" err=<
Oct 03 09:17:03 crc kubenswrapper[4810]: command '/usr/local/bin/container-scripts/cleanup.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/cleanup.sh
Oct 03 09:17:03 crc kubenswrapper[4810]: + source /usr/local/bin/container-scripts/functions
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_TYPE=sb
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ DB_FILE=/etc/ovn/ovnsb_db.db
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Northbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ sb == \s\b ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + DB_NAME=OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ hostname
Oct 03 09:17:03 crc kubenswrapper[4810]: + [[ ovsdbserver-sb-2 != \o\v\s\d\b\s\e\r\v\e\r\-\s\b\-\0 ]]
Oct 03 09:17:03 crc kubenswrapper[4810]: + ovs-appctl -t /tmp/ovnsb_db.ctl cluster/leave OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03 09:17:03 crc kubenswrapper[4810]: + true
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status:
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}'
Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound
Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving
Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']'
Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1
Oct 03
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl 
cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving 
= 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e 
'{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 
09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 
crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc 
kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc 
kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o 
xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status 
OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc 
kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ 
ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:03 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:03 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:03 crc kubenswrapper[4810]: + true Oct 03 09:17:03 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:03 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:03 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:03 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc 
kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: 
+ true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e '{print $2}' Oct 03 09:17:04 crc kubenswrapper[4810]: + STATUS=leaving Oct 03 09:17:04 crc kubenswrapper[4810]: + '[' -z leaving -o xleaving = 'xleft cluster' ']' Oct 03 09:17:04 crc kubenswrapper[4810]: + sleep 1 Oct 03 09:17:04 crc kubenswrapper[4810]: + true Oct 03 09:17:04 crc kubenswrapper[4810]: ++ ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound Oct 03 09:17:04 crc kubenswrapper[4810]: ++ grep Status: Oct 03 09:17:04 crc kubenswrapper[4810]: ++ awk -e 
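The xtrace output collapsed above appears to come from the shutdown handling of the openstack/ovsdbserver-sb-2 container (killed just below): it polls the OVN_Southbound Raft member through the /tmp/ovnsb_db.ctl control socket until the member reports having left the cluster. A minimal reconstruction of that wait loop, inferred only from the traced commands (the script actually shipped in the ovsdbserver-sb image may differ):

    # Sketch reconstructed from the xtrace lines above; inferred from the trace,
    # not copied from the image. Polls the SB Raft member once per second.
    while true; do
        STATUS=$(ovs-appctl -t /tmp/ovnsb_db.ctl cluster/status OVN_Southbound | grep Status: | awk -e '{print $2}')
        # Stop waiting once the server no longer answers (empty status) or reports
        # that it has left the cluster; otherwise keep polling.
        if [ -z "$STATUS" -o "x$STATUS" = "xleft cluster" ]; then
            break
        fi
        sleep 1
    done

In the trace the status stays at "leaving" right up to the point where the container is killed at 09:17:04, so the loop never reaches its exit condition on its own.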
Oct 03 09:17:04 crc kubenswrapper[4810]: > pod="openstack/ovsdbserver-sb-2" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" containerID="cri-o://6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:03.839828 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tk6p6_5a21c48a-596a-409d-8021-1425828a8a76/kube-rbac-proxy/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:03.858373 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tk6p6_5a21c48a-596a-409d-8021-1425828a8a76/machine-api-operator/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.262276 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc is running failed: container process not found" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.262759 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc is running failed: container process not found" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.265666 4810 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of
6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc is running failed: container process not found" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.265744 4810 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-2" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.268993 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8d17221-136d-42b3-9e68-f5eb5b026ae2/ovsdbserver-nb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.269065 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.297921 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d41676c8-318c-4033-a53e-c383f4f18fd7/ovsdbserver-sb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.298001 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.304221 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_deb66def-1afa-46cc-bacf-2f6d3f8ed245/ovsdbserver-nb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.304297 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.355514 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb64m\" (UniqueName: \"kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.355570 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.355605 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.355648 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.355701 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356452 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356563 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356613 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356644 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356677 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356704 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356733 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.356773 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357213 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts" (OuterVolumeSpecName: "scripts") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357262 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config" (OuterVolumeSpecName: "config") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357286 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts" (OuterVolumeSpecName: "scripts") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357611 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357647 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config" (OuterVolumeSpecName: "config") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357658 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357686 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357731 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbzd\" (UniqueName: \"kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357776 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357798 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.357815 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.358742 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.360105 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.360638 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.360687 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.360716 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gxws\" (UniqueName: \"kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws\") pod \"d41676c8-318c-4033-a53e-c383f4f18fd7\" (UID: \"d41676c8-318c-4033-a53e-c383f4f18fd7\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.361277 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts" (OuterVolumeSpecName: "scripts") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.363032 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config" (OuterVolumeSpecName: "config") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366112 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") pod \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\" (UID: \"deb66def-1afa-46cc-bacf-2f6d3f8ed245\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366157 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs\") pod \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\" (UID: \"f8d17221-136d-42b3-9e68-f5eb5b026ae2\") " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366766 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366793 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366806 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366817 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d41676c8-318c-4033-a53e-c383f4f18fd7-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366828 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366842 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366853 4810 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8d17221-136d-42b3-9e68-f5eb5b026ae2-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366864 4810 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/deb66def-1afa-46cc-bacf-2f6d3f8ed245-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.366875 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.369514 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws" (OuterVolumeSpecName: "kube-api-access-7gxws") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "kube-api-access-7gxws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.370044 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m" (OuterVolumeSpecName: "kube-api-access-vb64m") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "kube-api-access-vb64m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.377431 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd" (OuterVolumeSpecName: "kube-api-access-cmbzd") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "kube-api-access-cmbzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.381208 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.381251 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.384269 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.387082 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.389210 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.397751 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.428864 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.430112 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.434507 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "d41676c8-318c-4033-a53e-c383f4f18fd7" (UID: "d41676c8-318c-4033-a53e-c383f4f18fd7"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.434827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.435873 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f8d17221-136d-42b3-9e68-f5eb5b026ae2" (UID: "f8d17221-136d-42b3-9e68-f5eb5b026ae2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.438827 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "deb66def-1afa-46cc-bacf-2f6d3f8ed245" (UID: "deb66def-1afa-46cc-bacf-2f6d3f8ed245"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468660 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468701 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468713 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468759 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") on node \"crc\" " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468773 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468786 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbzd\" (UniqueName: \"kubernetes.io/projected/deb66def-1afa-46cc-bacf-2f6d3f8ed245-kube-api-access-cmbzd\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468798 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d41676c8-318c-4033-a53e-c383f4f18fd7-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468810 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468828 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") on node \"crc\" " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468842 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gxws\" (UniqueName: \"kubernetes.io/projected/d41676c8-318c-4033-a53e-c383f4f18fd7-kube-api-access-7gxws\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468861 4810 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") on node \"crc\" " Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468873 4810 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8d17221-136d-42b3-9e68-f5eb5b026ae2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 
09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468886 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb64m\" (UniqueName: \"kubernetes.io/projected/f8d17221-136d-42b3-9e68-f5eb5b026ae2-kube-api-access-vb64m\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468914 4810 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.468926 4810 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/deb66def-1afa-46cc-bacf-2f6d3f8ed245-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.493918 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.494200 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4") on node "crc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.494388 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.494611 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6") on node "crc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.512786 4810 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
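As the csi_attacher lines note, the kubevirt.io.hostpath-provisioner driver does not set the STAGE_UNSTAGE_VOLUME capability, so the kubelet skips the NodeUnstage step and marks UnmountDevice as succeeded directly. If one wanted to spot-check on the node that the teardown of the three ovsdbserver pods really finished, a rough check along these lines would do (pod UIDs are taken from the entries above; node shell access, e.g. via oc debug node/crc, and crictl availability are assumptions):

    # Volume directories the kubelet reports unmounting/cleaning up:
    ls /var/lib/kubelet/pods/d41676c8-318c-4033-a53e-c383f4f18fd7/volumes
    ls /var/lib/kubelet/pods/f8d17221-136d-42b3-9e68-f5eb5b026ae2/volumes
    ls /var/lib/kubelet/pods/deb66def-1afa-46cc-bacf-2f6d3f8ed245/volumes
    # Any CRI-O containers still present for the OVN database servers:
    crictl ps -a | grep ovsdbserver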
Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.512981 4810 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3") on node "crc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.570444 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a129b466-bb67-4ce4-85a0-4cd71e28afc6\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.570484 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5e792f98-036e-4de5-b88a-bf0b7bd900a4\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.570494 4810 reconciler_common.go:293] "Volume detached for volume \"pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dfbe49e9-55c6-499d-9e74-0238ad7182e3\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.755379 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8d17221-136d-42b3-9e68-f5eb5b026ae2/ovsdbserver-nb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.755458 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8d17221-136d-42b3-9e68-f5eb5b026ae2","Type":"ContainerDied","Data":"5acfcf4c9e7728e5a2cd0c1df052b3599672e216953a30120cfe60feb2075668"} Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.755500 4810 scope.go:117] "RemoveContainer" containerID="a3d3c9e0db1422a18299e83c0a317086eace0fbe45d5f64b129666d18947412f" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.755596 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.759658 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_d41676c8-318c-4033-a53e-c383f4f18fd7/ovsdbserver-sb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.759698 4810 generic.go:334] "Generic (PLEG): container finished" podID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" exitCode=143 Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.759753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerDied","Data":"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc"} Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.759777 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"d41676c8-318c-4033-a53e-c383f4f18fd7","Type":"ContainerDied","Data":"5dae80ce7a83165b8d3275fddb9ca028f74cb3e827dba425ed4db81bb9412881"} Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.759846 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.765383 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_deb66def-1afa-46cc-bacf-2f6d3f8ed245/ovsdbserver-nb/0.log" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.765434 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"deb66def-1afa-46cc-bacf-2f6d3f8ed245","Type":"ContainerDied","Data":"70739d00dabfb16e5084c88aa8c6304bb7a5a67d4f688d37017c1e9fd67bf1b4"} Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.765542 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.780044 4810 scope.go:117] "RemoveContainer" containerID="48c4bc39465880c3fd3ff215e7973d374f3ac5c9112df09a363894883ff9fb64" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.793404 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.798788 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.807393 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.812731 4810 scope.go:117] "RemoveContainer" containerID="1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.814964 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.822192 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.827331 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.831739 4810 scope.go:117] "RemoveContainer" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.848938 4810 scope.go:117] "RemoveContainer" containerID="1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897" Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.849341 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897\": container with ID starting with 1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897 not found: ID does not exist" containerID="1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.849373 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897"} err="failed to get container status \"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897\": rpc error: code = NotFound desc = could not find container \"1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897\": container with ID starting with 1bd10737614b0d4505a9211f5fdb92b66da745dc530c46edc63629a0b557d897 not found: ID does not exist" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.849394 4810 scope.go:117] "RemoveContainer" 
containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" Oct 03 09:17:04 crc kubenswrapper[4810]: E1003 09:17:04.849592 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc\": container with ID starting with 6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc not found: ID does not exist" containerID="6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.849617 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc"} err="failed to get container status \"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc\": rpc error: code = NotFound desc = could not find container \"6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc\": container with ID starting with 6a2308089754012a0d926b6951c3bd0e3e5f81c20c41c3aafcb29ae66c3dcfbc not found: ID does not exist" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.849631 4810 scope.go:117] "RemoveContainer" containerID="9e8625267188d3164afef69fbc3a4e9502bd91a19d31086839f399a5a43e1daa" Oct 03 09:17:04 crc kubenswrapper[4810]: I1003 09:17:04.864763 4810 scope.go:117] "RemoveContainer" containerID="448f08b3dc3c0ff8ec372ea2140895a01da8a963041f286cc742e69bcb16dc40" Oct 03 09:17:05 crc kubenswrapper[4810]: I1003 09:17:05.312874 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" path="/var/lib/kubelet/pods/d41676c8-318c-4033-a53e-c383f4f18fd7/volumes" Oct 03 09:17:05 crc kubenswrapper[4810]: I1003 09:17:05.313721 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" path="/var/lib/kubelet/pods/deb66def-1afa-46cc-bacf-2f6d3f8ed245/volumes" Oct 03 09:17:05 crc kubenswrapper[4810]: I1003 09:17:05.315054 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" path="/var/lib/kubelet/pods/f8d17221-136d-42b3-9e68-f5eb5b026ae2/volumes" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.254455 4810 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256757 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353c554b-572c-4c49-adb1-73cbe8c5bd10" containerName="container-00" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256801 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="353c554b-572c-4c49-adb1-73cbe8c5bd10" containerName="container-00" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256818 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256825 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256855 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256863 4810 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256910 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256919 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256939 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256947 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256963 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256971 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.256985 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="extract-content" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.256993 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="extract-content" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.257001 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="extract-utilities" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257010 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="extract-utilities" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.257022 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257030 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" Oct 03 09:17:13 crc kubenswrapper[4810]: E1003 09:17:13.257045 4810 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="registry-server" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257053 4810 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="registry-server" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257326 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257343 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257359 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="openstack-network-exporter" Oct 03 09:17:13 crc 
kubenswrapper[4810]: I1003 09:17:13.257372 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d17221-136d-42b3-9e68-f5eb5b026ae2" containerName="ovsdbserver-nb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257387 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e776c8b-d2f2-41b6-a795-dac7bf2c4a68" containerName="registry-server" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257396 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="d41676c8-318c-4033-a53e-c383f4f18fd7" containerName="ovsdbserver-sb" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257405 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb66def-1afa-46cc-bacf-2f6d3f8ed245" containerName="openstack-network-exporter" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.257416 4810 memory_manager.go:354] "RemoveStaleState removing state" podUID="353c554b-572c-4c49-adb1-73cbe8c5bd10" containerName="container-00" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.258499 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.261925 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.389778 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.390471 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbmd\" (UniqueName: \"kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.390589 4810 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.492918 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbmd\" (UniqueName: \"kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.493005 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.493109 4810 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.493683 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.493727 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.515268 4810 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbmd\" (UniqueName: \"kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd\") pod \"redhat-marketplace-x6lpj\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:13 crc kubenswrapper[4810]: I1003 09:17:13.622460 4810 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:14 crc kubenswrapper[4810]: I1003 09:17:14.103189 4810 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:14 crc kubenswrapper[4810]: I1003 09:17:14.844418 4810 generic.go:334] "Generic (PLEG): container finished" podID="aef3b216-6644-41b9-97ea-96ca1d07fe56" containerID="224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63" exitCode=0 Oct 03 09:17:14 crc kubenswrapper[4810]: I1003 09:17:14.844565 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerDied","Data":"224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63"} Oct 03 09:17:14 crc kubenswrapper[4810]: I1003 09:17:14.844753 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerStarted","Data":"4d21ea985f9f6038ff8b738118e40c835a3d6c01dea8591795a308d0d81d609a"} Oct 03 09:17:14 crc kubenswrapper[4810]: I1003 09:17:14.846491 4810 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:17:15 crc kubenswrapper[4810]: I1003 09:17:15.303205 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:17:15 crc kubenswrapper[4810]: E1003 09:17:15.303392 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:15 crc kubenswrapper[4810]: I1003 
09:17:15.308403 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-l99g8_44861968-3c3b-4288-b1d2-0da6e2d74cc4/cert-manager-controller/0.log" Oct 03 09:17:15 crc kubenswrapper[4810]: I1003 09:17:15.505575 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-4w69l_96ab5737-7fd2-43bf-8f7c-b5b10990c2b5/cert-manager-cainjector/0.log" Oct 03 09:17:15 crc kubenswrapper[4810]: I1003 09:17:15.508298 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-4lszr_eb52d59d-d019-4770-94cf-212c63c37fc6/cert-manager-webhook/0.log" Oct 03 09:17:15 crc kubenswrapper[4810]: I1003 09:17:15.852780 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerStarted","Data":"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15"} Oct 03 09:17:16 crc kubenswrapper[4810]: I1003 09:17:16.861621 4810 generic.go:334] "Generic (PLEG): container finished" podID="aef3b216-6644-41b9-97ea-96ca1d07fe56" containerID="29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15" exitCode=0 Oct 03 09:17:16 crc kubenswrapper[4810]: I1003 09:17:16.861671 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerDied","Data":"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15"} Oct 03 09:17:17 crc kubenswrapper[4810]: I1003 09:17:17.872515 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerStarted","Data":"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442"} Oct 03 09:17:17 crc kubenswrapper[4810]: I1003 09:17:17.895868 4810 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x6lpj" podStartSLOduration=2.259601949 podStartE2EDuration="4.895846195s" podCreationTimestamp="2025-10-03 09:17:13 +0000 UTC" firstStartedPulling="2025-10-03 09:17:14.846298974 +0000 UTC m=+8468.273549709" lastFinishedPulling="2025-10-03 09:17:17.48254322 +0000 UTC m=+8470.909793955" observedRunningTime="2025-10-03 09:17:17.892101895 +0000 UTC m=+8471.319352630" watchObservedRunningTime="2025-10-03 09:17:17.895846195 +0000 UTC m=+8471.323096940" Oct 03 09:17:23 crc kubenswrapper[4810]: I1003 09:17:23.623284 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:23 crc kubenswrapper[4810]: I1003 09:17:23.624103 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:23 crc kubenswrapper[4810]: I1003 09:17:23.667116 4810 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:23 crc kubenswrapper[4810]: I1003 09:17:23.964168 4810 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:24 crc kubenswrapper[4810]: I1003 09:17:24.005972 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:25 crc kubenswrapper[4810]: I1003 09:17:25.928139 4810 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x6lpj" podUID="aef3b216-6644-41b9-97ea-96ca1d07fe56" containerName="registry-server" containerID="cri-o://6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442" gracePeriod=2 Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.725401 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-sjq8r_8d16767b-6c32-442f-b6b6-6c6cc78e2c25/nmstate-console-plugin/0.log" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.878151 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8ztlc_98693660-7374-45d1-bc6c-677bb3532d3c/nmstate-handler/0.log" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.901816 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.936396 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerDied","Data":"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442"} Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.936468 4810 scope.go:117] "RemoveContainer" containerID="6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.936469 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x6lpj" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.936591 4810 generic.go:334] "Generic (PLEG): container finished" podID="aef3b216-6644-41b9-97ea-96ca1d07fe56" containerID="6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442" exitCode=0 Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.936641 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x6lpj" event={"ID":"aef3b216-6644-41b9-97ea-96ca1d07fe56","Type":"ContainerDied","Data":"4d21ea985f9f6038ff8b738118e40c835a3d6c01dea8591795a308d0d81d609a"} Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.966324 4810 scope.go:117] "RemoveContainer" containerID="29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.984923 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fc6mm_27a427fb-1f71-4935-89e7-764400c772c9/nmstate-metrics/0.log" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.984870 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-fc6mm_27a427fb-1f71-4935-89e7-764400c772c9/kube-rbac-proxy/0.log" Oct 03 09:17:26 crc kubenswrapper[4810]: I1003 09:17:26.996484 4810 scope.go:117] "RemoveContainer" containerID="224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022117 4810 scope.go:117] "RemoveContainer" containerID="6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442" Oct 03 09:17:27 crc kubenswrapper[4810]: E1003 09:17:27.022435 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442\": container with ID starting with 
6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442 not found: ID does not exist" containerID="6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022464 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442"} err="failed to get container status \"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442\": rpc error: code = NotFound desc = could not find container \"6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442\": container with ID starting with 6d1a5f6cc5094b60036090c111480c737b47501020b645839adcb59a67473442 not found: ID does not exist" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022484 4810 scope.go:117] "RemoveContainer" containerID="29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15" Oct 03 09:17:27 crc kubenswrapper[4810]: E1003 09:17:27.022659 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15\": container with ID starting with 29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15 not found: ID does not exist" containerID="29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022679 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15"} err="failed to get container status \"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15\": rpc error: code = NotFound desc = could not find container \"29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15\": container with ID starting with 29b66052758a8f8a9bce6d84097e15b2bad27c06a627a6251066a9ceb616cf15 not found: ID does not exist" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022692 4810 scope.go:117] "RemoveContainer" containerID="224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63" Oct 03 09:17:27 crc kubenswrapper[4810]: E1003 09:17:27.022868 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63\": container with ID starting with 224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63 not found: ID does not exist" containerID="224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.022911 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63"} err="failed to get container status \"224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63\": rpc error: code = NotFound desc = could not find container \"224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63\": container with ID starting with 224ae725d80424595c195a630791fe53de178385c06d683fb7f75d466e304d63 not found: ID does not exist" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.084293 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content\") pod \"aef3b216-6644-41b9-97ea-96ca1d07fe56\" (UID: 
\"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.084377 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbmd\" (UniqueName: \"kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd\") pod \"aef3b216-6644-41b9-97ea-96ca1d07fe56\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.084508 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities\") pod \"aef3b216-6644-41b9-97ea-96ca1d07fe56\" (UID: \"aef3b216-6644-41b9-97ea-96ca1d07fe56\") " Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.085318 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities" (OuterVolumeSpecName: "utilities") pod "aef3b216-6644-41b9-97ea-96ca1d07fe56" (UID: "aef3b216-6644-41b9-97ea-96ca1d07fe56"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.102056 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd" (OuterVolumeSpecName: "kube-api-access-7pbmd") pod "aef3b216-6644-41b9-97ea-96ca1d07fe56" (UID: "aef3b216-6644-41b9-97ea-96ca1d07fe56"). InnerVolumeSpecName "kube-api-access-7pbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.106822 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aef3b216-6644-41b9-97ea-96ca1d07fe56" (UID: "aef3b216-6644-41b9-97ea-96ca1d07fe56"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.186223 4810 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.186276 4810 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef3b216-6644-41b9-97ea-96ca1d07fe56-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.186319 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbmd\" (UniqueName: \"kubernetes.io/projected/aef3b216-6644-41b9-97ea-96ca1d07fe56-kube-api-access-7pbmd\") on node \"crc\" DevicePath \"\"" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.225922 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-9qx25_332c430e-f6e1-4068-8b12-d0a8bc3e1183/nmstate-webhook/0.log" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.263581 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.268378 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x6lpj"] Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.307002 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:17:27 crc kubenswrapper[4810]: E1003 09:17:27.307583 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.316469 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef3b216-6644-41b9-97ea-96ca1d07fe56" path="/var/lib/kubelet/pods/aef3b216-6644-41b9-97ea-96ca1d07fe56/volumes" Oct 03 09:17:27 crc kubenswrapper[4810]: I1003 09:17:27.475546 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pcfqk_1a6bb43f-fea9-4fd4-86c6-50fe6845ec24/nmstate-operator/0.log" Oct 03 09:17:41 crc kubenswrapper[4810]: I1003 09:17:41.302765 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:17:41 crc kubenswrapper[4810]: E1003 09:17:41.303645 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.075966 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-hvl4d_882eba24-ec61-49fd-8048-32ded4a05a45/kube-rbac-proxy/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: 
I1003 09:17:42.624748 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-frr-files/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.761334 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-frr-files/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.811147 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-reloader/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.842494 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-hvl4d_882eba24-ec61-49fd-8048-32ded4a05a45/controller/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.875228 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-metrics/0.log" Oct 03 09:17:42 crc kubenswrapper[4810]: I1003 09:17:42.951410 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-reloader/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.140858 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-frr-files/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.148451 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-metrics/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.173165 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-metrics/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.185436 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-reloader/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.371621 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-frr-files/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.376832 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-reloader/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.382194 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/cp-metrics/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.437444 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/controller/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.568744 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/kube-rbac-proxy/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.587808 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/frr-metrics/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.618044 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/kube-rbac-proxy-frr/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.753057 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/reloader/0.log" Oct 03 09:17:43 crc kubenswrapper[4810]: I1003 09:17:43.823337 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-zv2bz_d3603911-89b0-4e7e-bec3-f7496980b97a/frr-k8s-webhook-server/0.log" Oct 03 09:17:44 crc kubenswrapper[4810]: I1003 09:17:44.033054 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7d4655dff5-sw47j_bacb13e1-0d46-40d5-96a0-e765fdfd6a2c/manager/0.log" Oct 03 09:17:44 crc kubenswrapper[4810]: I1003 09:17:44.284688 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-86c9c5f47f-wwgnb_a569ffff-2a47-4dfc-b245-9453c87786bf/webhook-server/0.log" Oct 03 09:17:44 crc kubenswrapper[4810]: I1003 09:17:44.389809 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ln5cj_66aee647-ecf6-432b-95a9-6f2bcf7cf9cf/kube-rbac-proxy/0.log" Oct 03 09:17:45 crc kubenswrapper[4810]: I1003 09:17:45.573732 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ln5cj_66aee647-ecf6-432b-95a9-6f2bcf7cf9cf/speaker/0.log" Oct 03 09:17:46 crc kubenswrapper[4810]: I1003 09:17:46.390924 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h7lfx_e16cedf7-6d0e-4161-9ffb-1ba8ad4eaee9/frr/0.log" Oct 03 09:17:56 crc kubenswrapper[4810]: I1003 09:17:56.302735 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:17:56 crc kubenswrapper[4810]: E1003 09:17:56.303413 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.145982 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/util/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.367584 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/util/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.369306 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/pull/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.414008 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/pull/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.583117 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/util/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.603694 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/extract/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.649576 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69gjp9l_8db4abc8-dfa1-444e-a0bc-5103e7c66782/pull/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.760403 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/util/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.916780 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/util/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.938601 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/pull/0.log" Oct 03 09:17:57 crc kubenswrapper[4810]: I1003 09:17:57.946857 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/pull/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.120972 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/util/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.154889 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/extract/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.175415 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2d6zjk_c5b60a1d-b331-43fa-93ae-88e9bf8af024/pull/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.324038 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/util/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.516150 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/pull/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.527943 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/util/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.540873 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/pull/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.659019 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/util/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.735393 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/extract/0.log" Oct 03 09:17:58 crc kubenswrapper[4810]: I1003 09:17:58.895060 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvw6zs_f7564445-9319-4f4c-8b03-4d98503a9164/pull/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.016909 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-utilities/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.198020 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-content/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.201036 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-utilities/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.240078 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-content/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.397657 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-content/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.488688 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/extract-utilities/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.668582 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-utilities/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.876909 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-content/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.914859 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-content/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.981883 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-utilities/0.log" Oct 03 09:17:59 crc kubenswrapper[4810]: I1003 09:17:59.993132 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4t2nc_6da6ea85-e420-40f6-829c-5983a746b478/registry-server/0.log" Oct 03 09:18:00 
crc kubenswrapper[4810]: I1003 09:18:00.088449 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-utilities/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.111267 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/extract-content/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.299945 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6gdwv_437239f6-563b-41e0-8895-3d1804c033f8/registry-server/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.328390 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/util/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.479954 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/pull/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.526111 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/util/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.526490 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/pull/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.674249 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/pull/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.698765 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/util/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.703863 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c7skj2_dbf4ec42-1c90-4cc4-a6da-33d98660eea3/extract/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.857139 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-mxqpd_68a22413-3bbe-475d-a98e-09c99becb176/marketplace-operator/0.log" Oct 03 09:18:00 crc kubenswrapper[4810]: I1003 09:18:00.891508 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.052074 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.065087 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-content/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.095386 4810 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-content/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.231978 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-content/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.289185 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.313740 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.507696 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-content/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.531128 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-content/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.544965 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.585508 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9w6dm_c238f470-c89c-4992-8bb7-ad1c2cd58553/registry-server/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.745288 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-utilities/0.log" Oct 03 09:18:01 crc kubenswrapper[4810]: I1003 09:18:01.751152 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/extract-content/0.log" Oct 03 09:18:02 crc kubenswrapper[4810]: I1003 09:18:02.715648 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qfw9w_276bc9fb-9280-4ec8-8f8a-32f482040f97/registry-server/0.log" Oct 03 09:18:07 crc kubenswrapper[4810]: I1003 09:18:07.308192 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:18:07 crc kubenswrapper[4810]: E1003 09:18:07.308945 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:18:12 crc kubenswrapper[4810]: I1003 09:18:12.750681 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-kg8d2_8cea4c70-0056-479e-8759-eda2be569ee6/prometheus-operator/0.log" Oct 03 09:18:12 crc kubenswrapper[4810]: I1003 09:18:12.925185 4810 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d8dd4df-nq646_322bc19c-a989-4073-bd2d-166be2c156d9/prometheus-operator-admission-webhook/0.log" Oct 03 09:18:12 crc kubenswrapper[4810]: I1003 09:18:12.997799 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-68d8dd4df-rv8xk_10a017e0-9508-4ba6-8107-f7eda7c39515/prometheus-operator-admission-webhook/0.log" Oct 03 09:18:13 crc kubenswrapper[4810]: I1003 09:18:13.131796 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-krn6x_81f029f4-aa74-4b25-a6b5-b50a5933feec/operator/0.log" Oct 03 09:18:13 crc kubenswrapper[4810]: I1003 09:18:13.200518 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-lsjsz_70cc7fd9-2b69-41fc-be41-569b6c564807/perses-operator/0.log" Oct 03 09:18:18 crc kubenswrapper[4810]: I1003 09:18:18.303396 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:18:18 crc kubenswrapper[4810]: E1003 09:18:18.303949 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.275631 4810 scope.go:117] "RemoveContainer" containerID="4a43af4fc1e60e505d5beb2a5ed97dbb636ccb1deeb1559ec83a92891659f7d8" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.315615 4810 scope.go:117] "RemoveContainer" containerID="e40f124aa7b826bdb9776fd908df831935f9c765b9d0d3fec510488278301c94" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.341571 4810 scope.go:117] "RemoveContainer" containerID="ea6defe8bf672751992101e754e07e04f6da09fc040a2ef4b978763f323a1cfa" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.371713 4810 scope.go:117] "RemoveContainer" containerID="4bc03279b3229cbb4076beb0e89512d204882c5eb931d0c3923452f7172630da" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.404542 4810 scope.go:117] "RemoveContainer" containerID="6c7a7c61b7e6749bd7310b1f5a4503ed0964eabf688b3f9038e92660a8c8e783" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.429486 4810 scope.go:117] "RemoveContainer" containerID="45a1515c2acae09dfee0ed6214a4b847e7576925c5609e90af82c1960e172c1a" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.450922 4810 scope.go:117] "RemoveContainer" containerID="e23cb5ec44af577e88e6c384a440a4a119a181d40b53ef85b850150499c498d8" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.482050 4810 scope.go:117] "RemoveContainer" containerID="ce0bf123079fad6602177f98443c87294f51896340fb885890994936a2b8dca1" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.496207 4810 scope.go:117] "RemoveContainer" containerID="91ed729b353c6b324966ca1e7a28e60053f9e4bae84eac39f377089ebb2b27eb" Oct 03 09:18:30 crc kubenswrapper[4810]: I1003 09:18:30.514376 4810 scope.go:117] "RemoveContainer" containerID="3ef0c263e1c8f4ced443d655ece4132e18690001db86325aea223aa3d58b358e" Oct 03 09:18:32 crc kubenswrapper[4810]: I1003 09:18:32.302111 4810 scope.go:117] "RemoveContainer" 
containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:18:32 crc kubenswrapper[4810]: E1003 09:18:32.302703 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:18:45 crc kubenswrapper[4810]: I1003 09:18:45.302376 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:18:45 crc kubenswrapper[4810]: E1003 09:18:45.303612 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:18:56 crc kubenswrapper[4810]: I1003 09:18:56.302835 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:18:56 crc kubenswrapper[4810]: E1003 09:18:56.303643 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:19:10 crc kubenswrapper[4810]: I1003 09:19:10.302638 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:19:10 crc kubenswrapper[4810]: E1003 09:19:10.303290 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:19:22 crc kubenswrapper[4810]: I1003 09:19:22.303325 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:19:22 crc kubenswrapper[4810]: E1003 09:19:22.304648 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:19:33 crc kubenswrapper[4810]: I1003 09:19:33.302769 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:19:33 crc kubenswrapper[4810]: E1003 09:19:33.303523 4810 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:19:38 crc kubenswrapper[4810]: I1003 09:19:38.048717 4810 generic.go:334] "Generic (PLEG): container finished" podID="1515b15c-c878-43e5-8048-aa5c72ac62a6" containerID="c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6" exitCode=0 Oct 03 09:19:38 crc kubenswrapper[4810]: I1003 09:19:38.048789 4810 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" event={"ID":"1515b15c-c878-43e5-8048-aa5c72ac62a6","Type":"ContainerDied","Data":"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6"} Oct 03 09:19:38 crc kubenswrapper[4810]: I1003 09:19:38.049955 4810 scope.go:117] "RemoveContainer" containerID="c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6" Oct 03 09:19:38 crc kubenswrapper[4810]: I1003 09:19:38.269722 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hfgcf_must-gather-6kcxw_1515b15c-c878-43e5-8048-aa5c72ac62a6/gather/0.log" Oct 03 09:19:45 crc kubenswrapper[4810]: I1003 09:19:45.302885 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:19:45 crc kubenswrapper[4810]: E1003 09:19:45.303630 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:19:46 crc kubenswrapper[4810]: I1003 09:19:46.605570 4810 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hfgcf/must-gather-6kcxw"] Oct 03 09:19:46 crc kubenswrapper[4810]: I1003 09:19:46.605803 4810 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" podUID="1515b15c-c878-43e5-8048-aa5c72ac62a6" containerName="copy" containerID="cri-o://2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0" gracePeriod=2 Oct 03 09:19:46 crc kubenswrapper[4810]: I1003 09:19:46.611737 4810 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hfgcf/must-gather-6kcxw"] Oct 03 09:19:46 crc kubenswrapper[4810]: I1003 09:19:46.986048 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hfgcf_must-gather-6kcxw_1515b15c-c878-43e5-8048-aa5c72ac62a6/copy/0.log" Oct 03 09:19:46 crc kubenswrapper[4810]: I1003 09:19:46.987332 4810 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.022418 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpcg8\" (UniqueName: \"kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8\") pod \"1515b15c-c878-43e5-8048-aa5c72ac62a6\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.022528 4810 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output\") pod \"1515b15c-c878-43e5-8048-aa5c72ac62a6\" (UID: \"1515b15c-c878-43e5-8048-aa5c72ac62a6\") " Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.033159 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8" (OuterVolumeSpecName: "kube-api-access-tpcg8") pod "1515b15c-c878-43e5-8048-aa5c72ac62a6" (UID: "1515b15c-c878-43e5-8048-aa5c72ac62a6"). InnerVolumeSpecName "kube-api-access-tpcg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.124527 4810 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpcg8\" (UniqueName: \"kubernetes.io/projected/1515b15c-c878-43e5-8048-aa5c72ac62a6-kube-api-access-tpcg8\") on node \"crc\" DevicePath \"\"" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.149679 4810 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hfgcf_must-gather-6kcxw_1515b15c-c878-43e5-8048-aa5c72ac62a6/copy/0.log" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.150131 4810 generic.go:334] "Generic (PLEG): container finished" podID="1515b15c-c878-43e5-8048-aa5c72ac62a6" containerID="2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0" exitCode=143 Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.150215 4810 scope.go:117] "RemoveContainer" containerID="2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.150404 4810 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hfgcf/must-gather-6kcxw" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.163380 4810 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1515b15c-c878-43e5-8048-aa5c72ac62a6" (UID: "1515b15c-c878-43e5-8048-aa5c72ac62a6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.171429 4810 scope.go:117] "RemoveContainer" containerID="c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.225421 4810 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1515b15c-c878-43e5-8048-aa5c72ac62a6-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.230029 4810 scope.go:117] "RemoveContainer" containerID="2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0" Oct 03 09:19:47 crc kubenswrapper[4810]: E1003 09:19:47.231364 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0\": container with ID starting with 2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0 not found: ID does not exist" containerID="2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.231400 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0"} err="failed to get container status \"2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0\": rpc error: code = NotFound desc = could not find container \"2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0\": container with ID starting with 2020354af3cc1cb9278eb834869df5cd2bf2b6bec9d9c3b97bae8bb252e19bf0 not found: ID does not exist" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.231424 4810 scope.go:117] "RemoveContainer" containerID="c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6" Oct 03 09:19:47 crc kubenswrapper[4810]: E1003 09:19:47.231676 4810 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6\": container with ID starting with c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6 not found: ID does not exist" containerID="c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.231721 4810 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6"} err="failed to get container status \"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6\": rpc error: code = NotFound desc = could not find container \"c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6\": container with ID starting with c62d333e05580bf32830f78dcdcfa4ab1052446c8767c5251b160ca79a5658d6 not found: ID does not exist" Oct 03 09:19:47 crc kubenswrapper[4810]: I1003 09:19:47.311588 4810 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1515b15c-c878-43e5-8048-aa5c72ac62a6" path="/var/lib/kubelet/pods/1515b15c-c878-43e5-8048-aa5c72ac62a6/volumes" Oct 03 09:20:00 crc kubenswrapper[4810]: I1003 09:20:00.303215 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:20:00 crc kubenswrapper[4810]: E1003 09:20:00.304180 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:20:11 crc kubenswrapper[4810]: I1003 09:20:11.302683 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:20:11 crc kubenswrapper[4810]: E1003 09:20:11.303482 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:20:23 crc kubenswrapper[4810]: I1003 09:20:23.303227 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:20:23 crc kubenswrapper[4810]: E1003 09:20:23.303901 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:20:30 crc kubenswrapper[4810]: I1003 09:20:30.653830 4810 scope.go:117] "RemoveContainer" containerID="296ed0750056f73d3fc06e4f5a26871e9848e7739cf2c173227fdf91ece092be" Oct 03 09:20:34 crc kubenswrapper[4810]: I1003 09:20:34.303044 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:20:34 crc kubenswrapper[4810]: E1003 09:20:34.303759 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:20:48 crc kubenswrapper[4810]: I1003 09:20:48.302497 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:20:48 crc kubenswrapper[4810]: E1003 09:20:48.303272 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:21:01 crc kubenswrapper[4810]: I1003 09:21:01.302090 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:21:01 crc kubenswrapper[4810]: E1003 09:21:01.302936 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930" Oct 03 09:21:13 crc kubenswrapper[4810]: I1003 09:21:13.302336 4810 scope.go:117] "RemoveContainer" containerID="53e7e01fd393982e9d5cf0fad1086628e6c23cb26ce4df1c3e2c3d2ddf76bcd6" Oct 03 09:21:13 crc kubenswrapper[4810]: E1003 09:21:13.303097 4810 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-z8f25_openshift-machine-config-operator(e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930)\"" pod="openshift-machine-config-operator/machine-config-daemon-z8f25" podUID="e12d3cfb-2ba7-4eb6-b6b4-bfc4cec93930"